[Binary archive content — not reproducible as text.]

This file is a POSIX ustar tar archive captured from a Zuul CI job's output directory. The only information recoverable from the tar headers is the member list:

- var/home/core/zuul-output/ (directory)
- var/home/core/zuul-output/logs/ (directory)
- var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)

The remainder of the file is the compressed log stream itself and cannot be recovered as text.
07A/:{)iK+, [WJW .$2~p*•Ғ%!^`+'<pRȮ>\ɎC/]$pvV%'rdMJqf rzp.p@p W^\C+/-gW^J#\] \DI)+X  \yq @ڣ?vg0$8ӤAaY:c՚1L8⧖r Lӓ7uT)2[5 W l9@7k42jx[PӠ><߮<<|hE劦I'R"f@d9!MY˜ 03`|Aed ^鵛^-JP&f/NL\Heej%) &hr<}鯒\AQ(aXX:j\COٽ23Wݣ=*JTgu29M912\ XqA= SUdɏ_'W_ͲSn e |?"iݫ@oGaZm\iWsVymTɯdiOq_4yh6.<'[{ ]Z=7 pPNqu-LxE]ئjZ1jֲ6O">klZS͗Ѫ*Z=R|UߗnϾX&ӧWj';AOƈ/#F ^|=V }lz ~l.ud-Kim7u޻ɵ{3ʙmMﭩP>{0z]gE_fmyhir챻Vw,{[g箷i;{E7o)M3˭iF[C侚ODۚuZbY-L`t4#(XF&Ad#R~dG&eJ>Wy:Fm8<B!?kW+)Mj)_=ma+/ږbK.Wˆizҕ dV`54Zla;OԺm/%?&kl˫&ܲ4CyBa-'WY|`.2d<ޝJj߃`t{˜|QyYi5mv>f7Zw`վڃ{6>^bHTs+վFU{d6纍\\;ovx)N۶MWkk# N-o דp&nf/N]Ի!rm}˽.v쟇H3hlr=YAב.5 r]庭l3smD0_p m [$M.'" I6kk>6E]䍛|hlSNwi'WP;@Qi؆d3 ?PTDhpr9WקWe`=Wq^ێB* ۓҍl2na60Z_HG',{mt] {!lнUa"/LǠa^KޖvEu}#k4ظѽ冤Yzw'6lTvwޝP7X.hrXHuD)66Eb\ibIQs{ *mSI%puP=WVDhpry4 ޾tw\JC{\}.b9d C-W- {\T)]1Ws{\mzT8  be&\Zy\ʎEW=^W 1 _:PLu\2=WZïsNDW(WG+!BOWX"•`0 P. ժ'OWC@6\\P;51JqiWCO2rWOQcĕF疂u<EP-]HjGD+jGZvW6xpe< Xxpr_t=pjUq*q6t==%0qd`sh\{w۩@ju,-pE{\mz`;F+kWG+\X6hY^!UsWiWeym]vp[vyf[vNV&{'w{6]Ɣb`d4)656Ԋ^{U ٧Gbse( \FM,BtWRWG+8JWp (#"\ZWzZ2 0xR 6*!y S-!5:YU햞 ہ9_%t0>Cl1/λZtV8ާj@j_4Lٕ(XhruæU!lH|pF=\~߶vŕqJ WTڎ኷q)CkF2\CJ4W/j ZF ڽ]7ZP~(ekM]7\~V0.cYg%{6'Z9N-cZS޻3M [IQQؠVU~DSlΩd6"\`U/:f'\ZAT)x#ĕBKD++X,BJvWRWG+ɍT1EW XEr P]TJQ_B3!>uoW92`npH'A9|:*z?>|Qk<&u~ |jN)2hzwB'o'df91oeA 瞱BKK"$EYq눑.cK)vz 09ކIj<([yeR[gFUV413B[Mϲx OBn(2q/qQh8}1"Wᮜ2֕ƓgLX1ZB4H)SQXf4A(gF1뢐D"g>'\es;ߜ煱 +UQ?Zyu s^js @9v1**7ʍsvqFK0: y5+:w0?qmO5F\O]6 e ?Ip$/կ_..~qtoStyufW\K'J΢\W-A f8\ُoy3g!}E0JpNQkOO?܌FMȇ/q?MZ4,m꟫Uuv< a5 u6Q릥:^r\N_?IKїh8b(3)͠DEgCVֲ/lsO9TF)N'"ۏ﯋ċGB(/T\:fp&c&!?OAw!:| x+ nxI@ R}U8OނBuy~vF>%[-\1L 'S?2 ~I( ۏw("H2 U~QI™z43&LtOhYԵix+69n ,7@ς ~󙵹"&0*2:e&SSǫ%Ve3[odoXu:Toޯ_ա'ZG[l;>o[)Qܶ@hQ{A\qt"No|*ɩ+dZ;ĚͱdewͻCv.Ơ=cC[K!X]Uz<P>o.j:f{lן2mֆ`cZ ufxh[0n<~19{Wt7./a~ڜy #B\Q-R&[Kȩ&$AH7u8M+@]<%XTy[p{U`^hôІ 'p^hJ5!%-{vB\XU(FٻFWIଝ@^B-9j9xCDJJ3>UOUQJj6 k'H&"]$Q1ʀ4ҌoBH8b*j*B3I|-4kBL"A:䈵.9s!/X0FRqdAY̒/>1cl[Ĉ|T~6Y3nr40ÑL-X_6 e"r]yi烟 j:5JPL:`?_=44oJmwޏW'7ܜ29}Eϐ@$AD >³K AĞ7ftV&xsG ߒCi`AB0XLX(:ݨgU)|H(-ˋE~\:"]PN4k?r#)M󿺿r'|8?Ǯ<3۩Ub%񬸊G],T,~n=>QoizCU*8{8!]NFhxyUі%mW 
E?K;+=^wPFDnچaVcDǓrLTL'dGgh0X)f;:']u6Wv8_}ۋ~ay/.>mh9nl-_?~vvLMJt'<5`xzHvIfX lE9nJ)I7BdOmvᅷ;X^1vHho4m`Fٽ!/.f^zJt*栔Yuqg޷Rg SF;m(j"j1kMxy.G.`R֑f9& I{՗D{vSH?qg3Z9 Mw|;jO"FeZƣ5'H9 \xgqDu@P^tԂPiiϗP\B~~;}&[5k,ev2;X$^EվIo-=vl +Lw(n[}6ܢ`JDTMFOٗMQ>0J<2@B.g70k"G]O&HȽ+w+9o$h9*|A%VH9!Res..Ĩw)6)?ûaǙ NmHHC<4.AXK ˽yy+{H}X-5Jv[m~J~ElQP]TKTc-Yc <(-%2`pʨd^q2WT,s)(;'!Z$}t@ "ΈRR2 > j Ik9)+qϳ$2? Vmu83ێ(Wkm'ADdj;v*l9:Ni+ՐsmSI01=v 8nTFpRQj +.%!\͹\&J|ޟ_-e+yPPD(E`"Z1Å7/oZq zhx%n}a{}17 MvoNtQ{O]OOb$i$wNk_S==1kYiĿ6$݃&uTww9.thfal bKPsknwA;=dGݦ嚻|[[{:7pgZY l4߰%TA[yIo/w7#~:ZQ z`nWĕPX׮ܡ+CE՗iEtR P(0T g`7Y?>p8nr9/uB9+q*pD ldH/]M|^4ey7s}exFWIUMt!7~:Hv_r^t#"Q8AM}-}a į=MFbr gvc0i_ \XN N`DJ0U* @{|ʾ! mdFDSD#jċ(LC˱xT-jsF}HkN,coHƔDrA#@8AqHdz9KkS /*ՇGeI˒vHI;i ;Ptu^K$ _qq^+eJI5%*P'I#p 3}Bք]p0 ^>00!L[w00fRR!h2E)) b$;uOf0)uFk2܎-)-'+؋PiMF*'x5-?B S&L.*uNNQqZxƁ3bwb|9&N-;XnbZ <]t125ST޹2Ôze7=&-"PP4ar:t)= giL{4K.swL2"?Z۫Hc,Km%GkV.AdΠ7M./^a,Ep0Wn06 cÅe=uY nG"*yn7hcqNڟiq|'(0+{Y,7G)7, ?s^'}˟UN]K%$ZX3JuN͵?;mY2Hf,Yg mW,/,KP~0tӫc'&#o<4'B>ɞN>$u]uu޼_zYx Vq]{W\n#Eϩ!i!LS=e7}6 3"Mnx)՛D-TCfEg&uaؤណǑXѼ/TZ9$@'}=}%E"8Nx{2`}r6JÞ-xjfrv!B Blf`P[式4Dh) @i|!K JA8o) hMy^qc%њH8i/krJG6Hy43f8 ) Un_#"φ芫E@O|;r%1~04*’yl;] \2zw6:mi3 jn6ϕ F{cέZl ٥/YR3Unum|D&9@wwz(oQ`E}au:\#"G._$ۙSJ-tz'X<_ T%VJqTzBAaK~j"a{=-a F2-j@V}Q{1vvlG<Dx2`zT$JSZ.5dhB 3*" c#̐m@V:yX?K:=)-ai&8GCVu>t@ "ΈRR2 > j I/QLk8Y+jfz ɶ :mGbAq;ϫ_q*mP)XAUpp*K,Ч^{YJ#qVS"6F+3\{KS9ugn;l,׊gZhwpO4DQ'{LHu}wz2}ḕ{#@6&sjU4a;_,N#!=MHj =F u \i]vylG\J]׼UT}{~_Ϻ=׵A WV]Úkh:Ifs~T_'Y"]Hy(E!]To)>H (CJm-pmՈ?Y1?b~k^\s%Jʃ\$v:F=T_`SBF"x!NaΉ ,PIF@Jt01J,wFY-]|}'9qv>r:үF f ctaԆ֋Mj4?{;D )RaP[IWq쯪&|]| ahO+?oŠ|DYWBۙtG/Hf+ #i.>ɾ6x85>%-/3Sꩧ"%A-0.$d8YaEyW|vZg+^  F kz,}*8iUZ]mOOe'ϻeùExYF:ޖ_-+~ZyW$?}٬̍{&`\b)4z&+#SMڵA˓ 1NGsQw~e $sfr~1d]OڴUk9啒2OȐ3$cnsJv"]}6 b0Zt̖49KϪΤ)=0lBx޺ki3:KIK-ь?W++hXF|ڢ,$S{+%R =uc]JS$5sV6 )͍ %ڴk 'H `pXGHt4aa7Fs:l10BB2 3uS/ɐN Y$53.4ac5㣝uY>-hpQW+FW[aTry _ 1;εVHY td:ƹ Tjbac72B!ўf%fxX! 
Q#mBZb1(3 ^)2ND(._9J "y9Z2 Ew9lVJH?B"7ybqngz BozKV7vei+$e8OYuNWEyWUXWZ!&erM1 P!AgkW B.R|44~)o5ׂT <aaUņqlH"cO\}콘/nP7Ӗqy/_7S ufYnߍёOڲ8 5kBM*w(QbյtqK3sEn&Ձ ufhv:v2FX!fX`:L!0: AHEot"bo1Di]z*!nvoO:[VK19䢼f'<ԗf#mp8E6uA0aLGi1M}*c \ Q vsx5Z`(3*h2:?Q9b0XJFNX2 [pcny7`ǩ@qK\k}uے6eY}\~?-֍~RƓupL~־2C.(c9Wa5FIC4%,즳CHfM3u{^Ǚ10wvO+L }SzҞUmˏr8mߝen0;|ˑug5v 3G6Pg7ty1ըޫ \mBɥ6M{vw!o%.8ӔC[!V6OOׯ@R V*S}Sɤabk">lh?G\>ufpf/Z$ KoiD FA^L|8_:zhwٛl{9k[%h{k]@r89?#=~, X'8k8V>-*3׻uo??Lo~{}ݏ0Qguwu=0 IexHh^ruSMC{b;4Ho]u.&_ݛgnܷDإ9W$f9,+On/h9덳YmBKbsܥ/ Dϒ>)Hny9#>gʇ1f S<4lbpFF_nbfM52K@AF=iR;{1wPg Efx QL&pX]Pأ#3| SzH#$3:m:O`9. f 1A[d>X$:(cQnIoߺHc:20v.T$ϻc{TԁӜπצCtJtPTWCmF_ G 3ݹ#"᧠ S3q礞Rx4[3\bo҆Z]V9r;b."}_ؐ.5}l%Oٌ3 Oew'lN`{ԺO_~ >5!ɾia`wVǛ಑Yܭ-֓/Y\|9c+u'zY |-e;jLP/n C=.~?rL[8ҹw5Iw>? D^rÜ9Q [cIbʤY5Q㺞7jlM5RB)7j<[֑aYqGkZ )$c|e|Qݨ_mmݪPnjݍhݭܺ>t1Uһ%;I_| z!L2g/+%/#4ƞ_pt .!(w(:?= f >lvU{&`\b)4z&+G,epzӼZ܋ 1NGsQ8u/iWwO/to!(` 2 D?פ &uD9B$ 16>iDcDDwnT] Z6[cR;ĬW逰  %pV5pխ+#yp$\#\$-$#6WI\MW -*IIdWW?}z5s*( IK5^hJ T/(qo(uAnd}EK:إ*Brd8Z^u65qoi}zEa\[pʸTrsd%iɍ*a%8]k ÊIח U%Xo}zžc9pgdKmZ^z*jy@}P.HRnU}nTiͻv]+lğ'oS1P}dC<1ظF03bBQo8ba5ـ- 2X8l(SJ}wz_דJ$I[8>aRO?j{yo× vi,hg(&ՆS⥢Ya96Swe;nYW?NU싀~406E !),}N$UL֖Z$2Iތ9qOx;}p#h!*HB,d& Zhb"e)ąbY6!=_~C 6B^7NS?i3Z[&$xpE=Q,ɘ&s FpY2%/(̠Z'iC9-U{J:(E3AD(`22Ii7U ƊI v/eB6i,Bo,:1RrS\%9_R =;<ÙWNL)2-}Ž ]mI%$IdxA JUsw`s1&Qeȥ8[/6z.2:XQ;*=c[oH &1 iD{#UXYHt|YIK[7"Y/Ȓ>m\'CC4RlR61ys^*q(e깦՚#1B)Ig6+;VA+ O'pר#5x3&uozq N8: @ XփZja1NkP s.126@"KN\Ps FPHiX|/wvLc뜶SY2IpWO +抩˔NTdhUl`Õ1CW\:H'E|Rs<+f@57.G)f 40[A ƎqIc#f0!vpĠTc"eSI1δr*ٰ_e ǭ`I p2P po]MpF%JEpFY,</:^ ~"RDbkL]ZJx5{TP1R6ʔ &y\!1*`( < BD bLc`"@S[2,pd-jI`Gd1N x3V; -eo(F%<櫌1 E%3ťE8q"^dsia;3l?xKU=Ʃ;.GE|eI=B&(q`<" 0N(yi80J` s] Nz&L..H$+􀁩p1;`յ7Pj,(t_@((|"H&jZV"!@T. 
E,h*%k8nJxKT@rH|( lrԠDK4]&T*d@y!U@%IO p'r1쌱oܪWɰ$>,؊"PD Wx>I‘$r 4J&xyƴ2apaBO((cx)U$*IkzY\:\cG\ "ey􌜌j)"725H 5j*Y.H>9.>3!˃>d:USEce u~5(Wr΋`jt BeO%s%F56R FxX@a33#-+A-t1E zȕAB~)Q#2`U=FMfPO鐣,6 C1J Kn LacvNQ}6DE˥W`ebBu"JI-bJ$"0@ES@$Ti@5UFէ+z3!z cLU7B4/{qb>8), >\\(E ?o9rz믇'2 O;Dg& ʴ9ZhwN7Uݛra9;i9hzp]nuox<RӬz_L6+ɜ>Nn"xu=[_>7iZu9bww_Hs 〟~Seۺ%MOTg8Ymw6KNW[;I;%3A3ǿ`5wEY ]]QeNJT/CbF7^69* 5{ߙOnΧ܌dfVpDπ@խg)dd]tqU:_|WDESܓfYHY~,`|'ͬ |JIjbDP*rV]rNye)HoSs0J\s6VᄍVߚZC'6fz+}o|ѺW?~jI yzK0Tcq (E go7[H9v?a6c{2.o:>Ǔv 0_O>c<>~?dC[&wr.SNws܏Z_^w|ύG;>j:)p\mϖ|<:ttZ\r_"MzA 4q]]>kb29t] z>/v7s|q~85ؾ=ъ&Hgѿ4p` &^Qe : Id1Oc]zT^^>M89Jc׾ibv}EUï ?.k_:HՁ\H%s/%.f}Yʊ`*0ݎzHi7rqUn%泍`*vVD^ؖ-mzX ܾ_KGzI#'5^ENpC~8CP?E#ڛkzo8Y'kdM5q&Nɚ8Y'kdM5q&Nɚ8Y'kdM5q&Nɚ8Y'kdM5q&Nɚ8Y'kdM5q&Nɚ8Y'kdM5qVL *؁w/93}Hs1}o_uaF˺fs}9B?z:{N8';EPB*fѭۏ>Ɲ<~1Ҷ^'tTzq;;*-jӭїt`~žeį7}iкE0bR=.߽xTwŻ}1t_'3n64Sz@_闠Q'iKƿ@a((yzkzf}2u,;<cō'~j2+`Y]+ B]:PPWU%nIy"~Km{~BsﷴtX'tIb:_l?x yHl 6c;i960X_w,_ n<6^O9aKJD5e.\`%p&'v|kd76F?{7~~Fg5YFcs~.>LSҊ_$Z޽O__ߺkslb`Z1\KŜuE 4&c,$- ZTj26 ,sz.{h<}Ksڠ݃-X^otRH"nN Dr]"'WxࡶVY}rc*J<5oZ^$04b9neN̳L ՆrUduȁ4UHHh}JDɗMq~^wϹ^x%zC'z:˛akPWDzدUQ%!LlcHE!ѿk0ɫ?^rٽ/u_yڛSI^On %6 f3.xx[r^ՊӋI|}o31Gc]&f4<dRDo|]ܼޏ.c&s89}}"َF/mD/_^_)(b=[H ?۷_pm(K%o>Cs}=9Փaul^_n_!8m{cy?{ȍ/!;n` j+#jgq)vK$[ݶ%f*V}5>;o疙۹]sR~7>mI1wb.nCooX~A 6: f0bɧf9=yyW|Ȯv>kvZg> ߧ3AQ>`{OޑѴ%cmTt*aMv6#?<?ǯ#ㇿ}w_î'y֭k ?z?>n;LJqk3ԭ~[35C>b;{W|c[bW]oƵ7gطf{:*d֤+0llh\fT1F/BB4`.6˃l-6&x9: 44fedXѿ?@s vw?v9qGami+<6"`пto0:]8+ \IH[b3C2q-xne!JM" ƒ+Fs{G51yHkqH0,DDCAwwݶ`Βͫ?<O轐`3儹dI4!:Q&Z+XNr5!TW\}# >-#t2p$A<=Zh[;"Q kÇkfR:$1z_❼KRx oq-.Ņ[\x oq-.Ņ[\x oq-.Ņ[\x oq-.Ņ[\x oq-.Ņ[\x oq-.Ņ/oM>bԧ[ dx3O;zUx"o1x [|q5PΤU'I"k‭PTdC e=@'b_z\0>r\\̛#zp+>'H|*KS=DDR |?< cHC;c=:-sj㑝lG/rx=|vX]_Y #f]1ܟ飚nƍjܸ䧗7ruZ[%iuzz9oYŵ;z?&Q]Yldw׹O~À>etgy Z#h<ntVNWmF]l6BK-bm=_鴧~Ɲg>d̓n9ݷ_q`7Ϡ64?[N 5ˋTٜw=B|eCIvw mfՓz!/vrfA 6&狉f}ޛXANgތu;0fd`Xai cFa(10fƌ˜Q3 cFa(10fƌ˜Q3 cFa(10fƌ˜Q3 cFa(10fƌ˜Q3 cFa(10fƌ˜Q3 cFa(10fƌa̸N 诣/~r$ZZ"Q6,~Ԥ<,,x嫾*G} A7k1R~ e?0V2*;C/&?0|(52fl~a^^.~7HT?_}׫l-?k#u6I&uk5J ru x\ZRQ`Q)V4`p)̙M93/30:ޭIN|9 zaSmI0TD!VD 
`R4>B0Vc̣B-R2S;D*~~$kbGB͇`d*{%yBj8-r 6G#\=6uK!F^{Y<)CDˬi=.ƕ] g1,6[+T45HMj4v;ûhKfefMo/էjZYկzlB5'.3V e֒!lVgM3<;,|HYȝf靵nvŠb}kş ÜU0"Wj>»\%`q(Hg8 G^k 4BPk)VasO `nB5 'zo=W j̳}Q{6vz҂xv#1C de:KӀ#Ժ0,|g@"l( 'c9u*y^\X'N(FGDU @έ!9x )EHъP`0+nP㤯^?ahŴvv#1ńjY&41ZL,E+!4A&Eo;3ƦJ#zPalum MARm66`9=F•̛"Z1e<)h  _x8#NAC um{cMWm0I~o>0 (G<\uIO(n|Q UG[Qʁeû 0'Ԓ'[kC J)hb?.g1 UB[n)2 (W~zq#x6kT8(hߧux#.}:J*X=\>AUpCT xX}HׁҵQ~~=3wx`1pݓ6@Jɸzl5KF 3(Y/0|.ȃ&k(72Wze{=PvW$aIHeӂp@İYT L!b/iKkABQY;+2dl\h"iEҞRNZž@(Vtjlת: Q_p\wΕ\{t֐ 6 707<ݓ.l|{i5 ꘠E{Oת3hE9nncZ87=7ZnNj6]{ 6\o^׽y蚡.|@j~H?>ͅmX O5I Si$g>*>L97IY@gܰcr:-ZE:{C﬑"FƭFxI\Ҟ-Q e\Hہ%x8l:k'+_xane?+M l\g7mko Xx[*yW$!CBB'G)Ib Xxj쒢\ˈDuɇ)MJl%N wD-36WrTz!$KM :#E|"RKE0DEp`II˭4$KsG_"_\.ql_BΈ}!u_ӳ!\6e>0ղ ɝpR#:[ħaKĩD2zy\7a+)XdUħh:vuGMQWoR])S~ hż?fr ^B UB[B0J^ZO,SWs WX`Ér&hsD(+!ְyGͥQ 7;av[Sr k\p9dJ4$`FDhFZcTAbH ) a.M)Mcg A(e )$SI0D&$S#FkuN?I D6yhCCHalQ^/՟֗ >G}7 2>ߣ`.PX@CƳYA@A 䑃^[ڣ2,_uTVmWXl6^FS_pN^.]hc"Q'o3Hh,DX &g F8eIS@Dm$&3e]Ԕ{ F:"A(Q$wAo!GQ 5XN~`Sv9M+hVt%~ (͉҂{g{)z:^?RA5pRBr.N!1BA*$h1A,!2_WV<:x4ڋM̲6@wM'md2<?@Y0 m9GzI%wEMP 0&Qn"72Wze=PvW$aIHeӂpճ@B^Hׂօ$cv D*2Vdɘc8(8#&IEX:Y\p^%Q,jRU{ 2-V$HSJIKŞhmQFEN©{ÿ|]" vGH~.z$C(ˌE4_Zp_l~FQ7[I.T|~N@H˴?{6r¿ξ⚻~+  }ƺȒO$Nqw^(6e[1qp93ٙ%]M'7]yM]y=&{m6JweL.KIԴlYw WX:gX8:˫ SKztQ7ZRۆ=@hDڶL׍w78tZi4FS65g@W[ǑURQ7E]4C홄zX^@Q,I;4fo:cŚ2Au=TG{&n;6a7{+Mt3Ǿ0l9,1 LDC_E_OJoQԸq؟8i7|/:a%Z+z{"G/hZJ~܅!W nWyA};HqrRrhy[ a:)i$Hg"BăKօwa#yݳ-A8es{B7םPe+ A5slZ`XXk>[{ȃ~ff;35כZ BσnhZhed'+׃|}9N-(@<\$9Z(UyUr0^$h-90.XKFD[ ^:VS f{`rڰFN66T%Ĝw'fhsN9wkm`MRk%$4¼ SMR7,Ę@aV*;kd4Hc,qjk:EKl%hǶe ng}r<'!nq`+r);zJWlWtzpe_)A~'˪HbB JZ1 @"FmP%99=|i?A."b"Ŏp\Ogݜ<{u1mշ`eX5p`7eO7DŰY4|Y[l4,72OD!鐫u/f*=(zHц`& u҇FSqRYJg?~.&㎑8kj JXDTk9( ~:㦳j'k5&`52!eLXό'!JʘĉLXl@9;]90ׯ YdJz 3?NG>6o*+l?~}]rg7|ը),.w9lIHN S1^0I`P*cT.y6V-2_u0OuEv gy *jJSŽJJny`!ъ-BLd P\ҭ(< WA~k=]a݉dQfkp[N+|Y1r$71ڱr;ni'A`RzajRtޥLżĐHT"9 :AjYܒ W|Tu5bNi9s < 5nB\ve8)_M(&)jSna|a}IV\fwJm %y'\GCqŘ,qvDcLx]x<%x~fjYG8:1XByt\K.YU,iIn 9x(&|B<-KQ`Ņ-I=K0/f+5 r} q `9Աߩ 9c"3 FߧVG (UrWw[UKw/՝WepvI1_snxV-3]r 
L''/8M#U[kGbHMat01Q0yJ>/g=^;M6!RyK6Jb5uq&#y`箧c_ʏEURS <񯪳ޠ]~O/|S??%T~UO0nQ ߙ_F] 6m ͆f04mu=Ọ+|q`A >o_{c\_̎'X5"Aϫ ޸u%UJ1ϩT!0@_xl'#/6FiM<\qShRq:eY6g`(Oi؟wxK[u,k̈́W9.َ~/3m9[` &!End$+@xne!JM" ƒ댺3*Ec̴#(q`z/$L`9a.Y*' Pe|Z#h]n֛t:o%6f!Dܺ{Zϋ;R+f47ّx]xtnũ-OF+og9gϚm5h}_/j4俬`dvzC\2* 5*E}ʴacr6xYD:dhtHYg% t ,MWER4a^Ξ[=V+[ɦˉ k^j@5bCIYǗZ:cR uTxBaaœby smv-M0[=l]B|ڕ6#!>N dʊyBj8-r ET8WYs;MlN]693']btmlY,lL/遲o[4(a?y!$sdto/%) dkMmnNq J' ف4|X^φC<ѕ-a?N ֫Kp'υA'#RZX" q r@@:ϩSQw{~-+ Qr% PRj,pn ̙CNX X  j?v@j^V_Z< dhcb-VEb+ސʜ=P [j\#-CQ$Toшq@6 ll:!3Je$BGe4JDE95, Db0N`M`SMFvs$L= thRtϙ'޺R-mLfa .$B[O8x!sr f&P| \r( )ld }}|ۙax\_8#(a7Q;䬣Yz錎 s'mM˄  p#8'ɹ{ZDS*0Ksэ{.ca96EcE.vH!nr }vpDZFRVl\:LQEWȼv]!y2 w̟$uh]O+5ѐ/G!OP Pʠ@狣Nyn-\ye)|.Vf~z>5D-|VQYҩ{XÌ< "Y*|9,|TYBB^F_EqApoJt25>G5iy6[dbqeDQ+{oK/u:?+f>nQ#3h0t P|SU%"+$Tpcq9M^V=&[V?S E= QM"!V(zT_yqG1׼R7B:_<ޅ=Mdqp.Lb]&hJ$#X>8憱J@8٩SW򦕒7 UU{lnvMBrZ8EjItP`"(K^`b8t(e*KriNXoR2(3ҵA @MA4?{Ƒ 62R% b cr(9Uσ/qHJ4 bG3ꚮ_UWW9S!J[],n~H%:a:#2%uR1籧n| AhbkeAzLy־Q |q}6TȬ`|j!|k.G"Lz*0Izʣ FIk·oYn{q-VG[.4 }ɑNcSDNbNM+;;'N@gջo%P0 VVtb 6IV vwqUeQ{,fE^kzצP$oys6^1?ydZu+}sY,60>Ab00ul޻5pVXxw3g+j0g\?8_\z9G,7N;÷ o#]5 CQ/O > G&.|4ՌgcuӸqT֏ZOiƹJQI5|1MY>SǎKz5 X,;,\~?|>~?c`p`\:n"'x  0nkh*qQ[͸)7{_|L- ھQ͟C?w]]iwi^:ž`Jz{ <=WuQPe.T+_q@/ϛƾX[y4H)ʝd_4](_Qg Efi} `CqQ V+&$1 A(0mg=wY!^P|,GBQu rBL(dVh#".IFEo X"m;j*'.}롎Ăcm|\B=hEUUY)|v'KNQ+0`2#Ԯ=lcL8@njY2C4;j1&]]w$; W&;2fAHRVa0LwZ1hFs+%:Kw^ȫ '`ȗ8|z4dYxEhw`+_e؍tľ]'Ev.#2Kw[|侄ffn^Ȁ\Lpo}zh]?v7ַYt-3hUunױ^ϳƛ;l\a<~ޑ76ktveW7t\Ք\Kܹs^㳜.zum} rW6ϙ~ᯯWqj-?Q%׭|utnc{L,c0tX&p TQ͍Qm𜥔Rm՚ȗ󁯠*Zmga0$> mdbRcg6(Y8S۾*s 78 Ō93˭ĉuXTy^T bL` IUp;ov*@ƌƁZ 1b"ijZI}_q/5]46g Ula[[4֑gRZA ZSZ,# f@4t `?xfu7 #GPG8|.!Q#mBZb1(4(&xmq1 :qʢ&Y=9 4t=}|Ft20R}g?^# ;L`Y>j6;/a| J̠xXb|\q#5 K} F{nC "$VyFY,u;US6$"jxT> ̂HB9U J˽)ײ)9Eq~djGڴi/3V5Y{h-JB))& IGm7 ^`A Ɯ2_Iזҵjrjʗ[rWEɯf~ 3렟O:P: (LxN0+D{##a:0- 1mC,v$a iU3b A|@N@׆T# Ҙ6$N^i@ QzHLdu A9i0I̥6d-IZ'iRҎZv -f> e\wE,Dٽ*/yyy $a{'7Ġp \UdIpR>)=|4~WݓB 1%= *2pmDP-)ݎwY*9/ a+}E@<&"V/OHgjc%vXX K7aʇ(0sNnRnd =!"=AZ3E4^J@䆀X,-,xGI4 GZJ {{' s\%}pܢ"upc8\MT`ͤ fiYe;\!`v?\; m= 
V',3P'԰ݽvVmmL`]=s{"[mqIFUIN[rwqeo[.sW⫿ׅ8?Ý5a^j?^P wu˥w*Ͱ2L'"fBߙͶgrڰW2ֆפu>]0}!'D~pǘu(P"A̫bK8)NX'i~6a0r1oè%{}Vx@"l1Ո#RW`F]cQWZ]]%*=6z50")="uEՌJru,*Q+^]%*5+TWI vDGhP8tudSWP]1fG\J䊣QWZu CWQ]q.>&uŹDxU"cQWZ~*Q)d^ 2;hKҴ((8~~ ?=ANy̰8cH2#3!`#=ͅ=fXZc0C  eQ1K+m?S1X),]ba<-OwCht=eB)Zg[?'clN~/kồfbY$1>O[~qeQjq~1K X̴Q^. dV@S.)Ҫ9tw +Hv+ f[]}_I~V;Cd!^QaE:*4^,@iK9وlc1ts?>2pZ)ܗ^?Ԯ'e9bxUc!M/8?FʑIſWw].'+:jzޙ0r柊[d}̏c6J`J0aљlY ׿>*' /bߠٿZh9PV1F 6P+WN,I װ!=Zݲf*GH4ߝ-GaA* 9t`go_%NNnM?B3P:'9[!g]9"B1sC(BJOXZӴuQ7/.m[Q?z|Epvw2NC\دAefoE@j߾dA>8{{n㦑h#]4 C,{`JF]̚|8_6wћ[!R~"Fm\XG-U8<ps_J_W͎)][r5JugA9>黷?.?wO էoz{7u;0Q~ տ='Lz=rvhy ƮB%bpp CtXL: V !T+PMCz뾎hb1t5.P&l$[=(070d.9Qw NwMWS-6yy/Z|jtDu&f|۹c1/ ڶo7 {I>K#HDf[쐳"rqY5j}u6?bcm!PdqjЁFE);g=_]uH8A_m$P`NreF1P!l<\uر%= \IHo.l]:#q2 񘻠 6KsQq ygLXԤ|soy?Wp?0\* ՞ Abc0?4yB<[oB07{Y64e7:6^3]X.86j^!FıJqQFuLw  7np \A,Y Lwy93۟oo=aA6H}<=Nmހ,9'?;"q801e/Vl8Qr X[:l""s<hnoBpcH`d|I3][z#w72D:RXk,6,e;JaF'*\30SprX۔V)Η!1BA*D*$h1lA,!÷Uqזܵ5v4njR^~Th W^9d@]@a@1s2K$`{MDnd1mȀ[{':-qXRY 6(S{0SKt-0!|`]H"`<*k;:xhTj",(NSf$`EI{+f>߈o\uq.99lL⅘pPX`3.bO\`~0(Rh؇MBho%k&qXd 2) 8"]"+Ȋ"NfhaN6S0Þ[#[lI 8 ʼ>bxRм˝.t6CIhIB?Z3\sD9ޓcP|j7Vg:޻ࣃ{Wsx9XB| .օD Jh<,oꤤD##EU,I*IL<PauKz`Ŝ_4n)}8^pp9PW~PN}}~' D+}\ń 8/Qƨ8<hXO` ֱvltB_ǍhC"eC%4D//0]ɡxɳ칗xɳd+Kh&_y bnj|L nF^ps( !OF" ED.9:[wLO~0^$h-90.XKL6h+ {dJS,Z=ݙkisv١5.j˯TZ_&u qsFZ4KHā3X*,Xܰcr:"m!whXb#VG#$.iOuVK2.$mt6sJqNם7n,u…Wup]93-J=땓uB-߮LhtRIGW IBNR,ǝ ٙ \MH]R4`.]073U t8ǫ@ÁōO Y0Yk⻯`}TL?PTo3]z$†@ĝXQ ⨱SVqKIQ sWeoQW¬>uR;$Ak%5Q1eگ?luaA^:~2&oʠc"ϹP$(G'_a 8ֿ>ܫz!;^y1FUB^MQa>bYrI&_<]"s1ş1?-.~3(UXp^g?E J "GŻ:`IY}vj0Ӥ θS&J̈Ưz3Y@g޾sR1qniH{!Xw05:NG` 5Y&UxB?MϊR'h 7WXYoi3|_&zi(ͩ .fM.eGX}flzxՇ0s,M›ěCHH|Beo: 4;89̝mdbw{+Q Zz1#x< /׸~E;a Ls|5넡xlψ⁑0p,/2ޖ{v"+4[ԾxNMXJL v<}_QNm\l66iEPpS 5:6n7j;6~;{^# LTUn0ǡZ3PÎ.n ;Js:N$I_wP$EuJQt;p҃m #t/vr4a[.%C>[<YNQ{J'EbXo=WJRi䣝JcX4V*JcX4V*u,ZTk&iIMSTk&՚IfRTk&՚IfRYYdaj$2ILfRTk&՚Ibl5s ɔy6 )l<2ϦMN^)lʠ2Ϧ̳ oޤ̳)lJp2eMgb̳)l<2Ϧ̳)l<2Ϧ̳)l<2Ϧ̳)l<2Ϧ̳)lqHgSٔy6JgSއ>}GbtTˎ~ n<8ٚF?8*>?&Ϗf|<ׄ\]M@Spb7[Jw;~Y08 &DijY.LugZ{tRy.}[ƒD2"%w0nݶb&m{]ϛ *iCMu_?/9|Vu~Ryxy:54ռ.r2V^yUze 
ڶKj#Ykkyv4fWI;Ջ#IVK"?G/ҼY@9\v!0pzȉR4gKrTO+L5w F:`N1A$XcacB)7j<*8Z 1rbrV"DNW=L#Өa9*U vzֆ2YȮ(C_8>t EyeLY5lrI׮чhBZ.tZdu}[oW'fwjˢC߮'W .΄F?0#⁛6׃9H6CT4^׃H`R(ZAyJ;˜LcON˕~x΀1T)b+t@ 9Hys^sXeyOs==ڵRVk}8ꊢ 6XsJrlէ%`+Fz |7g+\rS"d@~DZu WCN<|0ws} AWI\_@aGv6ա~)uTh"X9Җx#)-L1Q (( 0!S 0jve)a8K6-dp~0i&[z'';8ˁOǗҞׯOͅ nQ\}|׮Xz4|.)*/ "J!dKN&4XX Gt㠴)߀s sںqK1^T;"E fS8&B* JJb,#D brb0 eX[TQB 8l : ϭvx lN!D"uYJW#XVApX:[ ),9=uMѿ k3^[/lB g`\%2@#zu~6 ضFn-.:D3`+S hs-"a?d$m`ָQDqᛓhf~44oJS齺1=ŪQeKdM; hB#L9@ǧRP;6i8ë q~ dN[lB@!邞%p;]2pK}0c,޵pd&Oهp]Ի;__INN׏rF)]sZM71T_$~ XHJa]uV5fFoiz|~b^89A@m'sn>FppzV-{;g(Pgfn03Fs$Wt5 Fu3,o}K;w1rhxlz`nqwJQEvڹV)xs'.v}9Ud" l\],ѱSt*&~:י\8~Oo⇗?Ɵo{W|ݏoc޿/,x5=_7[ km1tugת f\;ƽ> D Je/?7? .^h۟xt5!^| LrL&{9,JkYs_2.pV:Fd:ZiQ؟caqQd );İylP^]c_-exPQC{NjM4_|3WQ.!K$h"3K"+ںy׿_7'Y^'V V6$\AKS\ @.Cw!b7/^'ANb=sJBi/9v $g\KrUAfx1T*J-s 1X>\ Sf̕9ctvkgMG1[mlBSBEGS)h^DoWz@%dc7;A*⨎o.Z^R{%XU\d73fʻ \v^X(#l`z0'%e `j4ϊj4'oOe뿎j)8{3!&}EbI\F1u`~Evoc֔!}Xxʰ~sXo췋Fj#IK>Z餡[&~I.9uɭ1^:h-]x<\ISb@5M[)*SL,Z%xw#؇mN A^E'5YE_tr>ttx% RqjD;ϥ:t'<|NUؖ(oEODa?VeK +\x[|=]M3UԔxA J|*$;`~iqﶇPpFBB,[JL.0"4)qȾf rx$d#J IZ+!@kˑ" 2ԒBsRy:&NƇ=zLߤnu2׍:ٔ|[fc[_`48(7ځLQ'>zgzLkOŞjldI:ړt&deo:=<-$mU/N37O=UtZ'j/"$d.s_~Vd~33P$o!W?ktC|'mktN 3rfQYX3ẪtZqtrԔ JD2@PVahtQn[xjmNbG\O]]IVW/~^~t0jhy]cTd4 xǺl"dev9m tgvvּG9y[huzW+n+:?i* /S<`V![n5hXLjA%*z}~B\(wp)C83sVPFXpI-Mnc|7OZf7wߏ`(tL3bZoۼRO4S`ʽ3,,9"Q[dߕ69j_viyh?'ʘ>^UX-zg՘Iײ bSX i vU|0e`-SCmYwsG.| {Did7#Ht  *kɱ4$<\$5F܂1}WŽR|i3 谵oz52v~aе5=iWDyZк\lWw9nmͲChC˼]7znEs-CL_v2vm`hvH է:]ƗbwέzMy47M-U]]yH?> Uc[AIߏtǚj% =\qу9啒2Z!Y 8l2$&s0jj͹]kfa8YM"Z;Xx>mĨ(/?N}][*ˆAmc/{Lwoѻ@*[L> #ߧWH4j_SP eEc:VL[o7IJXR^( 8CvYI(J1,,VJ\1j}Mg3I xNiP]nG~B78@'X`GE*ʶ}C8EERj-p[xMX۞ mo.jTnC(֢Ti'bQ!8>jK ^i[ j0D ٻVEuj|_\q)]#w7|uQU F&o 'H k c}9Aӆݛ>}V!!: H[ Ksg H\N҆5L1gO˞=mvh &bٚ{.Q!Z/B(Z^+ IAtA x'i\Y++ck-QeJ6isgEE1ŘdJڰHqB7. 
BYp{sL((Q&f?0XDc:|FbD hl yJB.">Ql)6\R\ɱ_y7ѝ={:z 1w1%STx}q5҇)%j9 eN)Ec@G^u8g9L\vW,g  P PvggH2$1eOv`9141w>m[&K6s^Ea&Ձ 剷 Za{NYFr}gj{~ү,;s•0}7Nz)ӏ;=X8씜ARĻ׵7KHFgAY+zf&e~~䮃wմRs[2 O\ XzDDp8/L'=sgf̎|u:ٱcexb>qϖ^ujcICCB=jHm6wIѶ|#BQ5Qr?G~[ol(ޱľw}/Ae4(+<߲1Dz8M~3ޛN `$P}E'< zؼ > 9oTSDHc,ih2 Vx f$D$u r(Eya{I s//ws srm-{hJ8v[kwT.RI9냊N;: ĵPN`RGTPJm6>)baL-lU\R]лАZ')Tn8`k U cJ#68R$#?g)Oc.*51VGDT.:}Z~s;[*8@=_x:A|9ܗȯ0"P~\EA}hM]?~ I_TO9i{-Lp݌}ۂoSk~kW 6#!-`#y&S2k1fx7`pE+XQa"v %EZ5#rkʷj  몂K>-h1Ր~Ž!^## ʹW G;o)iJI<&ʠ@QV7P 0!S0jvek^j7Zg-DS; i|]kpf?Iu mC(DEptH}Ai[ha>rE}[x0(q ,953qHڔD?sI[G F[*#R"ATD2'TX\Jb,#G 돕A%cuX/YW2!jg8o : ϭvx"6p =DuYJ˛TCJApXjbLaITys0o,8HUxmّZvy'ùUnD_sݜBg`?Mg\{X͈xH4fD¨3Zb$b=fH7%g( ߿Ns0e??G\&W+~'bxϝ;d*+H"a:>RLݱ)pi{7q颾6W  ˊN[lB !Bq!x*ti5?;MV$>T;? Gtk5Swrn:W $@o&ϫ"׷\yG#`]&'ϿjsI ß0VP&?wީ3=M3¯tݯ+ oGf}atv*̥^|[ou;3QzTM;7r{!Ԓ-1˦fHs3YcCP(`bDmhi*A[dS*V˴&>`> ]zu +vҌ >ƝxN?%`/w\oû?ޞ>uN?}| @rTm,¯ 5ûѴniho4Ulik.oڽ>qV mMʁ(۾-]*y7K?ivbd'_/gfu΄wZTW!DX4 O ^cay6X#ͣ[ս*1vJS=4Y2}6-}%`[y4H)ʝDS4\$_vRg Efi} `CqQ Y,&v C} Erid\|S u ș sBL(dVh-".IFEo X"m!/BTm;1ҚMMEwĿWPƯ$[jݥ='^7_yTwz[)8&[l딓_i[ 3-` K)FDlU*6uip(SBD\DВIѐ@ 2-GPN,ZRK]{;a1q8ا*>>knqd48oަ|{S,_NYo}zÌrDE  ~'*`~vVFowG~ja/-OkEn&CsIۜ*i~S&eeJr}AkpO=$aIoE-o |4G?f3ں_aĕ ˏ7VoYfM6#J2 YjcXXY5͉3Mg^ӣwm$ؑ08&A|kr(Z[=3Аmԍict3/:X|Qiu>3o^A7g7,iW~jr$) {Wh~p:O]Y0+ho;x sZD)Z0la%)M8dA0>)'Kś\Q ^"VFudXDcZ)"V"D4LD_eQm8Ξ3{ow~t:T!=J%7/\*7Gy삡J6*TFA ƙARL2~M;t9VgQ I{"@ F/pJO03Hz4%GwڧLUz;u!̜%FٲfHbJ0z4H"ГQXݨ/^ne,:uq?0)np0sH[ 2^5`]<+z7t 8$*8ʝ7\ 1qV{tHbaJ3}[^jnIk9^=1yZk5ӶX3 jZNn_^XE\LƔEhTzQ1qpz!1~1xk@BKyL0AHRVa S띱Vc&yeghj4BZ"*Ls.xY <0 u,>O[4 i9{vT+We*UmeiYoz2WmuJV:Ρ!T$е`U^0 O.\.w7^FVu6fwMVˏo9s@+Od;"`[#({(:FGy8Dz!Vw.CA͈ 4`/= WKqOk 3{&+qXR{ ߄yk⺠㴂IQA)ఁ릍E @Ʉ1WŊ.HO]<5By2?¡-bi6\=h40Z6;=9)}L|BvHrEs,LdH:^J?y9ѿ,WLGh03ѭE;52R L{Es- pyKGLSL1QE*vÁ r) <cViǬQFYJyN]S-M)aSu6Fe KbcO&TZŌ cEj{;\ Ο >F ( &v ~32SWhF*~8uH xs)""7a>|vC dqCIʩqoR N{ޏ[Z<5|oU0vT(!%s]뮺]S蚑.ߺC]rKgorm_?n<,]:\vr7M[Otz\5+&Е#0lP: gX\D5lJ: Q!^K:o1Ҳc*6*&1JNh2v T`mcY *0wHsIMNj۟&`v, 0ua3o\ ID;f>ƢNݩ# 
\T9EH*i߾p:ήs\bF!B3q|Hz-u0*z^0i,yM2_(s#Z`Y!\C8XnE5O x˿#1r5jS0[oa2-l.:;SX?P8a$bcP%8Ƞ,8N ,4 Pg.D&gS:ÈGp̥EZr\%iEZrs(rz.ri\\%깄K.K.K.K.K.r.BSo^(Uڐv7.v4<0 m*CA ƙARLGsll5\% J#2 " "\|bzi{ɩ1kBzm9L0+ `EmI ZrV8'MaW0yPH9D ގ]H$HC[ zTZctQXݨ/5}e,:uq?0)np0sH[ 2^k`]s1yc[k5ӶX3BH }Q?./$JT0zIl,.hAB'I,6f/EzFY_$JcM^K̝9fDw ðHpB7. BYx$I%YŚ>Uմyc_,Ro\7'He6sTǃsN{/Sb~>k~>o+`OZYTWZ2*}WOdd>aFՉz~B6S^)) .B`#NrvmC ǝ s" ngFhup\01$ΐ\l3/ }L5:='vFeF-Q10KM>L꿄+nFaihOQp@n-2HbEf(똥yu{ &Qoӿ|0arroDu} Wp3/-M>sLN^U wUl Ժ{>1E*gDGQ(d^Sce*fI`eDc 0utjC/T70=Gm7 ^`A ƜFkp[([ؙdʎmȶpr~6]R}@bta'~0hX_doXȒ:Oc7Y4* %^FĐґ0х6-e;0Q*C0 A|@N@Bl{ Ҙ.4l+|u,+1 Z(QHLd$&E:yXҠKFb.teC kZִiԴְ#(E.b;ˆweE콞w38^aÿVP}GuZVלwZv"W`^śug$UN&_Fhrs{r[Ӈb1|0 ca;Rc^$gJB|FCo6n6> =08uHz|R[7g90+,M֛,YydcOL_8%Zb%0l nҖ%0w{@Mj#bwzy>zY﬙OhfV̬)g5E{V'VP։-Y^_ZDիe'-%k<(} ;ZmqIMlm{YM'GYl]i-OÿcbJ#Y>V_zظ3{ӅLgÏ=f6yHM"eec?LySFLY282;:Z,MUϖnb:諙mǗWʼp<(pAl+)9^MX˖;q̲j$ܚaakCWDw3&c83N$z7hȖyQ,AxRPnߧTv柽uMI99GuαħQoCtdơ ̨b6Xwm e$wgkp[=|K)[g]SU}NuuW y*JVϵ1r$] 1y(14`l9A<3\Q/ֵi:㭝Jׯ6R{=Ks=?}u^?ڪp8E6u,I0aLGi1M}*c hH4о5Z`(3*h2:>Q9b0XJFN>:dZĹ߂Wg``r׎־2\w`jN1>WdfDRaDU̲vn/:hx3L}Ѿ i!XWZ'e$^\EKL!3P)F6JKe4(FF dyV3PmRK㜵 @u4D{R@qНiY˽U9A [_~sԻÅY9^8«u*DCa`2WWb̨GT'fXM$.R'>I+N|R\'։'8J+v4*ٱ+V"}*IIpvW/]Q%RG䮒RJ*u, UUgw#c"I=׶UW Jjz dfw_ϭb7Ѡ^fB}*Ό Uq5:ן;8Ϫ"n2bp=PhR7U,&9"hkWcq BНkd !ޞG@\%fa@?twRbzK汕cq5I)P|u1w0ιݛW)6K.R9c#}U,dyuA"@<蠔^"ną# 灈T@$ϽJ;C@c;; a<Ǜ0\alzoc%CK;321BQA5[$$#c\rs  -ӉEh0-H-ӥM*.&v =z$BEӀspNi9 84Ӏs! M:s ?nB2h͠3U_:ˢwܩV.wOGPZ`te:!qݯA|(*6K'//ƵXT*H 6M,sy:,V_c0c!UĜ()D`Q LjN<.Rc3eF3<`Q{]r,' jyv&(3|LСv2FX!fX`:5SczfXM 89R&gܙ+ݞ&ֺ6Mgn=i9_d&}{VcD)8 ϨfaNz c:JiCVK'0E@Z`1"h@QAsgR6r!Ӳ&g=;ؐ7vn)jWdfD̒bAӟhO*},|ͦ$%xND-hh)r9]i0XBB,ZZJL)0"h3<* M,ixaRhwQBc(!"` ()<@JH/8xm9Rrb`ZRK/V8'!Ce9ϣ7+{/C5H80hǹIQ' .¨5q- j]9 e9&Y/0UܢAD)T`ɩф՘pAH iS!$t>!pˡNrTxGD41dNF03,}N+ !ʓH&pZR}礿\cHk<❊`)Y%tĭRIԿ黬62$* &lIŜǞ:M)Kɣ6Rˆ;\ WWW}u3?b`g".=uf^ۂ?םZ$ͥH\\Q1Ro1 x8Y>1TWƅ_LU{ݐJ{`;q-VGT")U + )8F"rrN.Swb*l을.'צ?@0Ь! 
8:o`@:/Γ(4Qԑᦸ2#ojOwuQ?iWM'niwM`.+0q s%L!ax>R0Xzf_pݯ+?^.'-tv&L|csq9[c(Ur=g?na5$xcO iF5w,}j#F1~f:ռN3Zp7J^kJ*9C"EO .ㅂ[~l(k7/su >|<ߏoo?}u~׳?Y/0rT=Z_'cN?u[]CӮbtHO=Mm>rCwGքfskm@~*~u*oZE @Wr}_W@l0xe*ܤZݖJkfR] h a#E#-l N ۩Mp8o?@ vwϛƾ-exPQC{Nj}sS]~ w/YfGba4>f!޸ ń`юdB=h3wtdԿRd }u L4 AKEkQ`K&S=B*Caj3Bw,`ZF hFs+%p`klP>?K[ 5Ni[W7Y1Q]MZy\K7hpvjzeٻF$W}݇Ryo`eSE2I !b:)l+q=j̩s3g>|ӭ%_;ȝMF6ͽ@hy[=B2Ou n?nfwO9t9fOA^vBfέOky4xb̼2r7?^ -vGߧгx `b•q&d/zQp1oYq@Vo c+[O\X,'5eʔՉNsX]5k Q&T_Xюbޭ^к(6]`;Xӝ`cі;nȹd:p>gu|)%?/[l -U)HXb#%0DŽ)&\?Pk g3Je]]0'ra8:n}Haber]JLޭ/SxvC bFh&q=-, 5jѸm$"*NY݀'\E6x2 ǩ Q2t?H*X!"SIrVG|br(,@(lqE7vfO].]җuk3w"gNHOb- ~P$jQ$Y(ך!|2 l=z'YAJ G.k%E \n0RR)59r 3nm lCnҘ%Z\xVFJ)bJbOPgm3fcEt0P/F!_0ҩt@hNK9Xs:\J1r|K0UU΄ LoDQ!hง|30TP$HςBS(z@)E/}P"-e;̘;aUÖAcN^VZyPL(%=W^+1NB -{b !9>yՌR.:ZR8ezrkC-THƱ`c)[J؄Jf,F~͸DBUXʪ ЧyvfL|ş h`0z'w#w>*QtT)6"b1DTJL$HXbA"kD\YQwFV\*/I{DD02!Fka]ۍaŹ<];Ek+-[yEF\ϣB5"EDLsS b]H"nT֖أz|[RUƪ˘1HE$AKY\p^CQRţ>ŭs2q|S%JZcJEK`98CbVj҉ TV ɇ]mE4 L# M"JZ`ҥT#Τ^0?z,3T9ؘQTx`5 `,$T8{a}$CR8nq7HR IPP IQssN_"<һ%QC7k3ܑ8irBc=.\z!|TyiHNB 1+ĬB 1+ĬBrq!YM\֔˚rYS.keM)5岦\֔WS.keM)U֔˚rYS.keM\֔*cUje*iUjeM)5XJ\`/j&wOEj5?5SIUO5J}6+ֻ5)vMJe HBn|c1HBjHR*y),/S4 (W`t}E+[cg}/=0iޟuxzzCzB@n>E>]0%x1q]6o80(CmEy:ol"X0h~>آq~gq؇E޶[u)Lo/fEUrsrC$.}b"F 5A)gRXRrT@ dZJM6U@6:y8dD`y#vO"o㡆;gYO\\JɬYtYW 퐫'*o,OCU&a`CW M>^dRKO " Y:.|)B38Zxtv(]qfQ^a]`yA֎ X=Aܜ?;N̹>xO`q\Ķ#4^v,|~㸜 DŽÄw>c՘D⸳y/;n0SwBc(N*z-†~W~PԤ8Ox mk.)$=R DmB\{Ka< DB/TMV>919&|>ܟKz1xm.__2 l:{]u)BBNzz]3N$nNlTM5K$PgGCE .3.a8IɲT.Bdz(\5PdYt{sꢡ^ _n.:P6oWJRcti"(I3e QfZ6H9ZE]$w юsQѭMv^[z"Uqiwy \g䝢6>6ލr xr!I(h ٸHY ZLrZ=L;@aJI,D]Zk)GL6@I'GDʌ깾w?fug,;v^{t wCc7Csykoja^H-> *f&) 1&c<04* 8FFC4R9XV[=)Zb#/Q˸<%9%x+zr<#|W-wIS&vճ4"Q.> cƮWKUs ^5W0}|iyW$ro4hy8))Z@w+} |=TB57<2R=+  QFDT 1= M4M%_.E"\P'sd9AW.}[X*wo &x}~n .$ʴ![O kЃht20H炄ܭ<@UT-"sdӴhC.Vu2[Q̣Ʒտߘ79 O3L;uCqS JXF6Lkހ%qֱ$IO\WګQoO+ʝ-E&UR`9B"WT$(w 8K΀/ʲ-Ѕ#0H©I@e$gSG'¿LIEx* mjwH;|^i+oہxFBA.@ˣAзC\⾟,7v>u6|ĝei8b!cCji0UP!tx;CVM2[/U M}S3u }"@˵pAb"&?ygLD |ѽ}qylu얼x3)bsiͨDudApRD iK:= 
TB)jcsQ3iYm\D0224?C'90I{gsԯYoдOؕ9h?u-ڀK^x-!wh|Sn4$#QDKքX/JpkJg(D{KIQ* jEH#sI+,K^m(9SR 3%#SHM4SUR\Ax %"Z(@?c< ^*-C(835+\sƍwtmЂ40Ix˂׉Zj7KL34Gy'aɗV ABRzqiRtޥLSŽ {61*Hi ]‚YaAs ~O_f`h=a |pwD7zlb㈚ie﬿̓0LCރ9oϷ\WڭZP69gI$7|OS8f)-Wo#ѳa|l>/{**e|VSF?Gs+(bfP<18MHzR-Lѽ|*u7s{LJ91;$Af,zt] +#xȮv>D-:ǖq!#Fc~J|^hvf2cRViM\CVT7AVNi1@%y`zRdN%fT8Ǎ7~ᯓ?!+Es'Gt`ȋg"xe1 ͩ{ݑ}vH+8ٍQisb%jSLH90q[L` s!TCXM'U掮Yg>(N7Ɗ>x&lᡧ -Pħ]~7_tp5I,;Eش±De\mӨ@[3H{+#TYM'4Nf2zLsj g7~}|B7$vwt6pʬ]ߣû _uE؝}%hz zIqHrhڃ G;ĠmΕ歷wO9͎˰}4eG-}}ݝ/Cyv}(!6}puGNjzXC੷lTީM5zdjK[|NJ|%P7_e E3q5|uf?:tNЙ«乤?W?~4ҌWWaXJJ 0FVj- X5 V]pPu<zf%%$h f(Z˄N"I+0CN tI?mIgAw,msaNpƄQǧ8~AlÆ\z?q(W;B Xy4`;/v_&J!D8 Uܧ,gRY@Yl#X)[6/ WS; 8}Q\:::,ȋhy4hnG5@KF ,:,u(袵۶+!D@t 'gjodeپe40 KiT'T;"1NnβY0rQf.s"s]PR"Dd0ի*2X__# g(gs*&.X Sxdz .A4ɂ+$9k$ꔖxqqmڼ&}x1c? bsAza[Ey{ %T٢`0-H\P*[i?E9CTl*[!WOќŰS?Nan}~\RO fr,(&C@pw'8~}`F[:型%=E!-x1xy}M-|,ȅ҉/?pS/{=ha62笅6sJ.>eX{5EovćYڴ7{71Pc8`-ؐGW]"L"bbA[\pvV \UZo\R+pUGUVs7WEJ\AsH1V*i٣"%4t͕VԐHebCzJ;]͋)vK?_Z(禌 g;϶C QtKh^ftZ@Y*)vfP٤jnyry"3`d̨I{PFkKIe }T&D_2*q)8 at Y G^E6!"49&Ǚ3_Ch@+N g iB&T,hˬ 15.tQd.0B?DbCo.%HL99-WVCFOw1IX4 Q6>'u\sI[xnl9"颌GY D CP֨ѰXTI!i 7>3+J !<(_p7fbӏb9g^5N.7Y))9pBB" LRDM *S C&N4\XdYqE]/ QM?Gk]}vP.<s6xZAk*VƲB,䀅A2͵d@T=W59Ba S.Ct;\ K(NmЕUgaE3:R5ɗC:bD)h|U˨Dw $1) ӓV.s夹sHնmj\[h*BlQmZS{H{-NoqAt~4|]Ngbgb6Lƞ'cagBl1#GfRY Y5fZli%Bd |LP̦ԅ62hHFd=f pFEc+[ja2N+Xjq.V46?=D 8LȐyQfCnFMl}CBX$:+j춇S?on.S.\,b5xE-mYYr[je2@ӄL@H!Z:knd1+Ho8B'Tc~a۠Lk3gڠg&EA ,Go[,_Z"}HŴบn8_ Z.נ33 C^ϖ# lA3~3?{Ƒ 6`[/ ӼW2.78 ~p[[)Zbu7uoEzc mrbl߭d.V0[ܟ`[ m{V(Y(f CL淸WPTѵChMmvXFFkխֳΠLxSTʨbUUNgV 41e[N^4g9I^PE5Q;ʹpamaM,%LF7+B>bQut MVk̃m8-o1fi,hnNܤ>Z- <DB8NMhXX Gt B8FI@|´=:)w2;"E IE$s2JXJ3K`@7ތ/ CbwvJ̕ۏe5TvnXMRw?^a5$xH/i : f0b~n2ѣ٘ݫxi8*AGdӨ*QGis!Ld$ ́}1Ww==VuJN?κvu/xou?~x Vo006@ѓIiUZ+6]R(M?ݸwmY1VHhb!-bcuN2f!޸/@ Dt3A(0m=D?=:808q>xN  `D%" xKuN~n0 /==pR JMޜ W>ǛSхKI3,wN>ri~z4j-`QfT()r j)g\ gl?e`Y8u eP\H&C-h)5![&uŸ涕9]MtZ^MA7@n:VʧߔSu~VwsO|uy JwԩRn Nׁݨizpc)nZ ?$bg ]jC^/Z|~f 3-`Sn+3/ }LGEn\Gg:j1 %4"* < 
Zk%D)B9DUh{dN&ySs!˩~֨FϞ@oH6aE2AGC.i]sc̱b<IKZR0|%gx&$}iNH><ҽ&ѽr$#o{G)`J[H'HE5eT:F^ݨ.ջ▱Ņ,.2`*S"` ., '1beP-Ju,e1bں%-hXPD|0aI T)J]"9E4\ KÒ(RO&вQ`-DG|C&ê ,_TiuN|Nj- #8R GS y NƁ=r%G˄K 5xg4Y5P2zŢ(b'8>jK ^^] j0DPR5c6rk(e/0gJ4k=GIB|j*$ <58ĜS)O핊yJ.ni]*{7?I&@&EΨv7`{ mu [W;bd5~st5\sRMx4@c%Qmb<II#.)Dc *" A&y@;Uъ\=C')YJ.Bdz(L( JO$U]d1^6p]MgC߇e7"W;j;jxfRb,NT0V5iS$ʬ\K#"s# ݲ>P1@#ƀf"P $Τ eQ娀|^?j}^dk<,s559KL=PYrxPNrxUd&{-w;z~K绾nhjqsu6Ί%)dd"e3h1ZK9cDM89"RfDtnp֋q&~>|PѲ]҇^NOgc_s.x>UӔEj)$^W00IY`x19 MnK FFC4R9XV[=)Zb#/Q˸ `nrf ë/WUٲ9gƹ[D H./0A1I BZ_'h]T4@~P F''Y!މ˖bvb~NEB3כ,Wc#7݁`wkwVgwS&^LgϭFKl,^3)L]RfAurN͓>&胶6gj8̟>Ɉ-|bg9u`lߝd.N0;|9Yw W/,XPDԣLj۰&"[m,C?7I^wvZnܝxm}`tsDq $eT jpGt4Ef\x$cq{E9Y"o~z:cH2AOLR]yVaf?τbJTl<fnģy]`3m}m^QE5ItRअ+>IZ@%陰j $$jCJ*^ i Ҩ\|BEԑi%^o5́>7;sJ['M v^Wk!.R SC3LJ-6RS+4+a D*X819ّP!}O-rrigEHx4yyl5lF ;!V<LЌJTG}p%bL[|: IՠP)c$\KwxpV[8QȄQh]P>7l5s|3[u#غϔ}<]rcSg57Qm8-9^`h[_h|\АD-aZb*a<&*L-!&E\PzABl|¼:+wr)&yP)*)M},m@x %"Z(@>qǒ8b_zu-kθ Z0f" oY}jǙ%p#Y/LZTФK a{hmb9Hi ]'H- &aV f bo#.>03 T'rx/GDN7#5ITg, G93~ev=%P=>895Aރ)( rp$;s7>LQ ā){$rVQ F>P(~sr5Hqq8f)"slo?óxݻ"G G{&::)DS_*!s4:o;k=Mξk .V (UrO[}U0#Z~|(o|{9=/[g 910 'rs3 "dOYOޝ|pqijIƖ@,o鼩܌nnfY> 8: G0bz6Г6*#[uƾ*IԼI^\3 d>r+rDިZ}Ǫ2ycauȎǏ}}߾xL>~w?|Vo804$ی#6m5͚hX|vE]nh ƙ[:}?qwy׳r'/:XͪTAvza b~^zyן(]0ʟRhm6ˋ>J6x\"}.fpghV}2?nZ*c[5xODk.2`NN.oJ۩s ,s8tА"XjY>I Ph#3>CF%9uv)B2^;=Ag{!gBωI4- DGT0DS ډ-BS_F_(;mضY|;VK0`oBV[[`W +zo~~4yU&ޔ}SSG޾| ~VS"G7r`r%mG|bjg6vxq6(ӆHo=.BH4:{$ sA L'HNx[da_ݛ2^ްm۽R[-ӌop˄c 'L'of-iIa8.q\`(N[jӚ`+u )gIOΫ F=WͫW;1N[LrD`IP p Xm/V2scc I2& O5Zks 2%m`9Zh҆y]ZΗtk_=u2r#A> |nۓ!qO*[ I(e)8b!cC*i0!Gl_H6ꔷEDkM'6DB-ME Θ g!0qK-M9ijn~ G 5eٸqoL7->v OUi_ iCռ= (}c!BZưj#מ \˽Ar}r9?J/&燮9˫_:EQ/G-S/G%RWvjөgDPMH]!QW\2캺T.N]"uŸ7`rTUiM0џsap*1m[t(hg)T -@ LqVcB,ջ C1_ #~In%tb \,F|q53, ߽]kju33$I6A +NBakͬ[M$2,A KH!)p']096GhŞ_%ild;cA:zCF2(" l>( DŴsZB:,Yipy01JK?h2B9PtB1u2$ͳ\_] PHfZI **޵"K~1ǻ9 s A cd8m?!ER!%5ERj/V4fUWulD?I:C:@8ƈQEH:!#FcecUVYwG)qQܝU?-fX\u~l.x~ҳD0d$<2t2!QKޕ%ZMJʅ{YKRԿ,/KRԿ,/KRrϥ #K<¾acT_Կ,/KRԿ,/H9)/ */KRԿ,/K˂!-)ˎ֛_q\T)W1D+\7\RU73۰Z- <DB8NMhXX Gt 
B8mJ"@X|PHwD@$PHd 3âҌ;$$" (O ? + C6) MhcMa5a݉)_; Hx Hyn3;SdB(N3KIaT ᰂ:#2%uR1籧i|RAhbk-Y/9W 9W#nlv ~55l^b3{3HK0;o>B 쏱Y㾦AIqO>O].'響Hא{q-""lØL:esH)D0 :RLݹpm y}[  ΊN[lB!ď'y\ 2p_K0j1hK5ы_4%Mzr` Ǯ s(Xr1WKesg!aJ糵wjG͇'7#|۸>kUpvA]sm~ Fq8-w{(Vv3kh0zZi^xeH2{Ufh($\f}/qxɮU;U $ ,d$-́}26oj_,y/9>/ 6|R{&;l0?>sO_~󯿤>|_||:Hz ك{Kε4X*Ҍ|vwY|uTwB/F(|͏#v)ZzR۟ttwjUzC[5{L,c0,QK$O+&9K\S"9aMr|S)h sFaT\&lHYG* J3!^٠LeL )&Ţ9ˇl tT.Qj ( * )HZU+U:09eQ#g9R.{ w Ԧ~%_z8tt tDND.ç.Q+ȱ7KT*TܽwW&p.vM'6U$T"_iY"`iLz%3ԝ; /WJˆ=o{M88Jx jB z< 8xG?jW[Ƣc\Ẁx LqYCjXXLjALLg/Ng ZGK:bh՞2#,RFXp4򴖸qqu;Ϥ_H1z<-dcAzN8bgz~T_4H3L0o,hqc ȓN  +GmM(ۚE5Q#mBZb1(S$xm1 'q"E K',dACWkbAcARGOH] M+ɂ/y l=GGB2\91&|neԕ}S>=VX \1$cesJ6Q'}]i+eX: zO[9\YJ\O9\q#5 K} F{00C |kA+ʼFY,u;.obMR/,@(vGE`.0 r# T)0,VJ\Ă\lp9mP,퍺VP5<+RL@ԊK%b*x5sf"(ܚ195c?J9.OՅ2.EUU$o$Xா]Tf8+؁Z$ l+,qqD2Fy T@$Ͻ0kHTȒ̃.0 jSAܤ6N0+D{CJGt`ZD 5}3ێG~Ÿ<;Ekˢ ){%&8M"pvL^6AStw 'YaU3bVA ('84#ȁQ1a6rƨOצb<?UfF,㥡iJYt)$@9O !`IkY*H+0"d`(QHLd gN \j<$53R.2kll׈߯|b~Ll\DHQf^,z!zuV f cRHG %uBFlNQB E/C/>G29kkW! !F+P.FH| K-zNF4עr^"W6 W! 2S4ҷ 2)č a#hʟnB@y{anK[#33 P1-'xO.\v}~~?8T\Wn꼮*0+M,YydcOLDX,-,x G]Ȧ\twLpڡ2MQ{$ޮ ]F:gZ\HE.G.]$,YpaY 퐫 V;P{~0uWc'ۈzԑ6va`G͡LOz쾆YrǤ6\E=$mcb#Wt6ۿB+|u;{6.3 zCQ{Tgaz˜c LZ{l6PrӱA%hh9 o;cJl摹x۞q`3ţy'/TZ99IA_OE_-8J}]T( Ea›q79Id Q81C4< vk=ZZayǃ {ޛ909&|L&?ozg^tc7,{?&^#FzPVMf*@+ o3߫(XUZšJE!h: ߗGBQ y/HDb(FC"ÜS0 D`'{t*ȉ0 D^tcj|! 
X@UN!-%Oc9mb^J㤶8t֕(J"%M}lϚxM[D*5q__u_7@_mũ zU*y*MU6._v!ȝ4LueO>M ,S`+h%NNWJE)m{ ^q"R mՁ*I,*!o&IyCLQ]v#ŨdboCp[*XՉls;+L()f Fޣ=n/-\GH׹_%V`I96R> oѡ$<M֗wqXl2%Ϳ :hc'ьKnK@7P:Z)sn |7mRd~8r'?bbd+18drQZtuΛ<kK+o뚻;YfOukʞ7Օ8XFEg3U:ZΝ.xb* yB Hr$ .H2L0RC`RBG)o PQ*KR9#c9R mPBcNpaQ~Yfk'3y_~[j>'z#v>DM8v6jANBhJ4220 K $h@FLY'ꦟ5t9 =ύ.T8~ G]num v`޾"<[u7Pkg˧x΅`H@AF' I4aF[J4Ӕ *aOgGCZCc,GbedF)C"YY h̥f9#I3! Sy2]w$wK*x{^ΓZüQ* [$p8v{JLJ2DJח{m:}Y/S5}qd:nEJ{vCX0[Y/܁BS.pg_;;j& 'cE?RBD[ :! FYa|(*%%-V+]R|Q<GBP _<,~)kEU(XA0q05z6Lrf{Al3\ _.]q#yz f vfW:fWl|3IAT+I+%O)g\xRAۋeOB7ݹ&[vh=^Ÿw >ưn},Vx3}^Apں.i|]8®|HƝ5~o'?xân)FJG2U8 LD..}5a||&gض?ԇnz6G}y^Oq4}:{v<-jratLKoddV.}Ͽ?cu[?Nf?cUJfY p~=NNr_vÓ,!-Rȇ/Zu~7%^7Ԩ}&bc?M 5f!nrrhsq o|7]rwxޅjbW=$ >?Fo:D,z~'ߡſG0_tՕ.YtήS_7 ޱM7)юMe9:wcQ}MI `2`oܱI5/+u}fآ2,7]Lm31Ɵ 7Ot7lu);4(1m;KԽșs"nxBRBfIT12#/J"$qF(4',L Km}V'Z+ 3gkI&F HAZj*CP DY!'zY2H%{+zMZ㬙؏cdV=V=hWU>r_ijrF1[)!B5,S$JJp*"V䜫Nos6ݻlZq:V߹Uf-,1')9G/"S4D4 T@2Cr~bOqdzwHzGw&QɌF)': }Jn@qf;74^~ɟn[z|Wi?w3bgW>S5fW*J)+|Gǃ=sϕsiNI5D8Hvi1cS$ OȔcn}+s^FWNKyzTi=݄?OG M`XH@‚6iOX1&oρpֱl`*ZJȅ3&0Z#7 QC tb\߃W򬯠OYZ8hW[nڧhC+ziqLT8@`,iiSx'sܺ3esS%,P#`Eʇ_ z@OWGWM9ᔕDPJ ܊ޘTBo:BϊpT HTj5ZkE\ 5a^*a5,YWQ &N2c6q9\§MjvIc{xHKTp .EJCxPD` \L T3w"co_CE{FF\h4YʬdG7Rnw,$^,adVQt"XU1cx%wU$^Gjy5'ro#Gje88( F@&筐x*Jq$f[$)N$J")!d`$PO1&$?L+Mx*Ms=1B_`#m|.+OSG.V=yC^Rн+z1s8 Y, JhH9Ȁ Ya {TNy 8̰\&swM<%7(^]]ox40Z '۾*fwQ=xb^̰x!H6-*fFeEDȌ|a(L\X2ŵ"@nP=bqIR32X!xmRXZ9Å8^!A;!텞¿+dIF~J ⺃y݌Gow$|9qr=&E[O9X*LX˘JjnNH\Džb6z\FmBqo ci$Wt>XP^p}SF0$M`0M6ՠsC T {{!cP%G8CXwu'ZZ)OV %\"E^Iz1)H Hѫj{miC2:z&9BfGæGJEjv[Ȳ5&!5ԭ!ݧ|ZGnI3r&hmX'#!L%1Q9 >A]@(cZqXF.V\D$HxS2͖ZG͕"rUS]` '}Xn*0={]`\WLnK<਺LiYȂCn)(kLp%gh)~:0:<=ġkV3@ G0xdc)i,+A"8:/s/U}5fcG`%7ODlQ _Ҵk a.rGs\閊2/h8)'7 0Gյgjf]Yʻ'oWmapv!̩4d6-veTmyď;`l]OZ =FƮfUYc=7>Fbǣ|>M^!Yffl[FJNJoI,M\ WWپ ?6kT j҉_f FU:?w?׿݇~z _~xuh1u"H`02?F׮y]ߥ_#Y[`ᚫy%f/gw,~4t74Vmo7rFh#Uf9!kCG"xeBm9D'>PzϫsVak&H02k)h#Sf3Sda۠dcgj%vHh^)sʢŮ]R/U}Ӥ\t=IؖdSf۱dz=јg L,.Z#B5%Ɵ7XoFv=g<{i"ߥ B}>9=P?s];UbH ^5q|T3CMqz&~NTQw?f [n 
\|KFBR/4(+tz*Ξhȧ!GY}T^UmUF8}^#އh.eR"P;P;6Nt:pF0[VEx^#\Cd*8rNJ"yˊqۿǝr]N??`.=냏21ſgt9|a'Lji R'Jx0%1Es[^mnNI sxIn]< h=[Eah6 ҝ5Yں{V˛7&$ʝa2oho~us|]t"N,g6SF9 Wzij~뮧r^:{CW} w(j[|scXbGۊbNJ#\7I>?]@F/`\oK2i FT̒ېg?r [&#g>{k]ۺauSjKkyh!K{^n+w$hMFLjeN"I+0>L J!gSz|'yR!&@ۇ0HƄ_pVGiB\;om T:hP`EѼ2'+ck&JtEaDU'ԂshW kLvh,eQ\ttQO W9+74>yMty]մ:9`T1Z2m|Q]#P5jk'mp\56#9.SDR(@DmKtHE0<%uɘ! :ze<dFtt3N8 Z~ Dccי8GWX2*"% O)S•2&eR!8jUviUT^?=jtA ˤPձ:<)т?Nݹm5=OΫ74Ie݂/UW!Y#BfHy8c,8;ʊcsvXԹO+Up:FE8:SX 8!eS6wTBfueOBl[N~:ɦnK9g"٨6=@Ñ6G,`bVXN`Y%^TTO)j_}6Ƙg6&E2H`SjSOfI16gJm&8+2}[OZ߽[W=cNǾfe`쾞G3"o^-$͓͵lϳvXZcj Cy :7֨TY$Aݩ)^tyYneD%ࢊ*.Os~):`X(cLٓ IpHQ7e79 s)wIXY{4Pn(z /!Lp!vI35dETJ^]B/"uP)l:+q*O#TO }(֙ܫBo;'yQ:'p3fDKo 9p3Ct̑@N3v&vVvc_{_;EAƯivqGB4  u*5LϿp2D4Ld1&(2,BM!Y$LRcK+3'_)Ε)dK&SlǬәD-5u;؝c(;ӎ}ڦ=ݤBfxN.Xb%`ʒS9e1t"]Tsn3҄U(B&dȁ(ѐ(l2RVE x߱?L d8}}=##qs #=6xcQȑ9i/ی%Oks&Y]DjYN.E3!h5jn$)`7b)&ԱGLU9>uv%u]{Q1$ZƠx-QqΠ0nEP 0 8ݓUBJZvκa'UQ (eMBe$;L. \ܧ:զڔܣjSr/Ru999U !J.9<]oLy\S=S< Lɪrz=glR{CsB F:[$L0Q[c8ƤҠsЖLtmp9!@p Fr#[>nS<$2A>F#dؔl9~O5Hl[%a[,ߩ: ]w6|QR$0V,5`J_n(+ǥEQ)k]y`@2&+-Z+1Y#d P |ϹqnqKs0?! xV>3"^O5H$7'XG]<' V9%"GlEBTCT%`(!\zϒ@t1*-X;Ά#U| YE/J#m\8|U.瓋s4 (R ׷T/m!GHR B$# )+ --hνs|A,D¥ \6@It2Jt {0KS ĩvhtdHfEZ:t;aog1Y9ɧ7錚x:^׃;jn,ԍ&J[usRK7"}5M08$4!^``*q:kRŎ*gh Jc,o5<3EK (e\Htmv69x+z|Yz[M",y P4Uc4Y ٛ y&!.)Hl!6-OrcsiǤM5,3ꜴQEtS|h!&-Xi#1HN5p\~s~9m|n,]p B>8hʒsp z 0vo՗@H9^\Cbɋ7]S$" #,|PA:Ѓ֯'= к:+\|p5C '7*zՔl[SƳW݃trx b| FS 9EJKhqr1(Ϲ"(9ϓTJ.A %qkݠDzΛvŰ:AG+ן}kג!ϡ|s#S~ZeD8VY|)wxp8qZ/iO8 'x Q[%YĽV\&Q"3MJ !t*ĥGb"2438ȜIaI܁ZuΆ~ %W9D$wVlq7GWLj|2uT)߸ M"CE^D+r9JbN4z\EP %)JP*ю+łg2^^9ɣ yk.!**}"` (湷uDi@HDdn(R1Mpx\/d7F3L2)Dqd{a J`ʃXqjcgf݄-<+8d0РbüdGx !n=?D}_o][J 2$ɳC?yH"'ʨ@`2P_6 3w6Eo?^9@܎CO&e$'$]x%f8N%vࢷWe ?N9Lxݎ*-nͩKڻhR'whb,1eO%v񁪛? 
U7ko8D-{}([TvxQθ<|P e'Pt6jdRyտi y~?#Nn~D},[(ɏę`I ?6~_Mg-\Ԋ*PĆB A Vk6Od}?|ZV\ zgBGM&xvx5孏^<'k /ubOpV6_~Ddt/צɯr:G{9,6_F< Be_kcm+h\}&WӵwU\%o;Y">`FVJWsah ss_PvIY4toy׫C_vqk?̖?M&p?t@0h_FqU^n4tZ ʏ Kcj[DHyCuw{FTklt3mސ<.&ޤdjY@aeg&ƼuL 2ыٌٛ{lx4t.JK.QrLó.*,ґ`y̪HCURabk?h6gr4wfӾ<,$NߞwRGGt|3+$=R (HHԆ(5@Q.Do!%TM]/:z}:+*czgWZp/OKkǸe+Ⱦe(I}T Ee3MճEeۊtdDZP2[J_Dj뤤TFT'I"҈rЀ%}zI/(:B9m,1')Y Ѣ$; !P0PΓ\dطS< jPMec }z];J҉ Ҋ &͔q*D5kiDd 9^4;@Uovj@] y3En8l?;~]_v.5vR~[\>:[s\DkD+Oƶڭ{;Q#n>uZ)ih 1DQabRIl1'f\0'$Koq]'e,CrbOC(p^v`j|Ȯd3&z1e魧5^Jp/d`z.F L'H4陷[ݹ?gS튴`Mi.r$޾y!Fpet,rD(Xw`%,x# 5/W8XA'Loռ F=UͪW;1N[DrD`IPh 2!Xm7N2scc I2& O5Zks 1%k`pFئnҽO>ܦ{4{>Ql A)8:lKvѬcih7v*iR(ͺwo32_ \eR+4+׾& \Bqbt;z:Whv"!B9'tfG(.q!JWXG=+j1?t&H4`fT82Glj `8O%1P-IudIA1RH7wxpV[8Q"8FQ&]P9x˰襾0\׽T¦H@Պ+_Nf>di.7WYf󥇻n[^˦*e;U!9-"Z´&z%U[xLT:C [BLRmZP$5Z/ӷjqeL&yP)*)MB7ZQ#EF Û1cEPo`?χ;\sƍwtmЂ!70Ix˂׉Zj7KL3TGy_˛T.drrU0U)&TE+R$ IZi3RO @ǥT#yK {bȞ&4*`m,L`aQLgOfV SwD|P%bdT唞Qfrj}Iރ)Wnl4Wne-n'6t:?MlNmک;7V+M[_%NN `HծG~V̽S3GT(nV2\8GgH3P9խy~Y\·WHr0pFórZ۵"$?Fv\Q=I'˶ǰǬS'̫5d  $'3Ι'iwďmF_=J1 ,i5mlZ,@5HFM+Yvl_[oZk3Ē;$3sHHΖKql>g'㟾ϟ}/_w_RDz}};j̄UL O\y=fG=Z{Ѥl÷y#Ͻ:}q6aai_\OrVpv ְ7γ&Xa]=\/ʬ-^*O2 q7 0ѩ=-~nI|D)QF4* U2s VY;0ZcU!SEWe%&JY̛緇pћas lDͺZ ' $\ .[`F:O$;|NGnb)|ߝj*4]jteWq ӄ5Lv&Hh0/ay;D#c?j^+}ji^[ݵ_8~z*Qyz-^::mCM}|~OoKp7/EQ@3w4~}{YKB|Zk dgφ$9]XP/uEL5]_~\3U:[u[8[{@Cl8pӮ'oW/ tHݳ՗?Y6~*>fϾ}1[iOWг=L6W<~47>mL'nu痆ހ:u^\v/nt\}/Hh1"6ŔKVq,i=.Q=ȋn^wG./Waиwh~vv3kυM޴s"AؿWxs_tœVA{W_{5҃JU?7RwA"{7^tgpų5Y5n`^ [ys|-#a:ݼ劲6]ߙ)oIj8_#w_H |9joUYT5dHU&ݏ6(mQ~wyo/ַ=Qx4ì=wp;|pQgRE3<9B4Z絥dQM%Hx\P)&jQBr](cZ"SŔE~u-^k? u4JF;;7Zk)x)';/Wx'/CS ThF I݊-Šc0ɺ쨵ZCд#۷rcԚ1$kFH)L$e`Xh!`Nµ>u _̈́b &3)e%lFZ+ \.SՇSºl&LZԩeɲ88I !Sm9 Ф4IxPQɔ F`o[Lp+XZ\LdW{! 
^@a-.FLwo㣘Mʽ7lC ^%C(d[5J).ڰ}5AY55}/挑s"( F[4h"ȵ STBf=L1Oswc5Ⱦe#8:BҏkluĀ,☵(̔NrOJ=i^:gICR [J oeM1jD䒳"ac5V&Y)dS.dGoU7W*KI!oDJRJxe@njl D)%[0݊ NV` <6ԀEAh7x Qj1@yUc-+0 hty27p,Մ61c1:i4 ]K ՐYuqZAj+Ⲯ>^, R`N4ZdRZJ P74dkjAt[T9EP MilS}*)Z wUTIXI2,#<"fЄv8d)X="z]C Hd!f@57._s/{φ+1h aͷ#),kŜEN0t\"0eWaLJ.̐b]K@kqM65d&/vR A3h*3!(.0͑`-I{CRP6Z'JER/mveRs~`XPh $#(bem LcSNUDRhH 2BDjJc#`V6@۝@VAr`7_&8M2:nq9XvӋ ~)}ŬQ@r|1FL^c B /{aFYvQ|ך.>m:@G* ]45Z.U}-ENpE A-)J# I VWe2'Eaz ƒMzOy }IE@V/VH!3X`~YX:7)@JTU U;#*)8`%g/$Sƪ`|}k$2J^!ExB.` "Q @r!Z4@ @ޘR{pY$\`t?/c@AK^/B>J3:hLMD0jik 8^ Az2%Ykڍl: I,Bi9D@?x ZA 7Q%o0CIaI6c;*gDH! cȡhhͥVa`+,vLk$$BYxf@mJj"*^ `NQc-!MJw+ a:H؀ `Nb$ =KPao R FhӸ㠭zyc7=^|r8S6mWM̵DϬ- @8tqt3(D4Q\ۘ1Ev໣ bjSѭ1}r%RV ]"4(ߏW^IaFhQ"}_wäD6@mJ e0Ck7vzI,c5:#S :( 0rHAO!HHY@qoڬɰ~\JZoyE!XIBiv7H.z*=w((c)U2QLYL 3ETXD t أtF$ a^'p `NLUƺ3fn4ǚ4*E?I*e%;d$0r$jd-µNZ"O[i՗WzǾf1TBmEѫڀi#ǍH0+EXAIP ̀|"beT}_H 0!"Q rR9MN곱Ĕ5bUj4,M/2caVMrƇHb KOq%G,X2輩G!wT`B0Հ[zB_>h^D@֒Ƶdh;R!PN n>)%Pro@VʳsTu<.Y J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@|@N>M?J k(z^ +f%w@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JW ->)*7J XÃWJJR J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%gH 6no@\'E ԭ +`@H]J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%ZZrkiS?_R꡼^,__}εU,/O/G D+Nu (i]3YEy4l٢2p=}zp>m\*/'ˣx3&tM"j!wcyc_ÌV xa>yiO5 ~D5}Ƽ=={m B{%ַܽQ.ܯ*q<}MFz~]A_^ǭTR М|Xo/f'oz.ٺT4ax_4PJjs"¼G\e(O1ʻO )aTey^^zA<ȟd?J+~.41.OsAڼW>gE7gdAt-97'g<}c +5l%BϹ҃v;5dS̅_2ڥɾF֑C[Q4XF3{I9weƬ g#WP -CrpjHo.u&jgIJ޹a%ܮ ?8vsc_iLce] wS]Y/S?l`pɹe?0bj|^z+ߗli?`NN5>\iұǗdλ[ 3LWv^ez#CuSTY?/?FFQ[$ x^{uzX6SE5E*<clkƗ?_Lt(ewWE7Td` |a<:Ϟ'1 _N_v8=:p95vwNǎۢ&O;wpMcrNfivwmH_ѷ6@{-0+VNo/IV:\rp^ g6gy+rT{w.MIl-`[lzvz%[Q\6V3\*B4 Τ'J;J%bckRl?g򒙕>3;i>R8^ ͭqrSZIշ>[lUf&%^X\zޖ^-v?+O'74N~6NNƶZ ٢W x.&isn'j{X5 OȺy1~~Y-#&Yf~ C|ZXi&x4v8|P*Z +`E?\|o ZYzxP ,o~dR XQ"(!z`Ѓ`\ ,:$wF A>R$ -G>sA3q h^c¼U0|G;Cn$f n)F$v/2LPc;ڻ9Ҭn`Ka>Pp4,.69lx0hTF]@>'x_h^7f7E딀lfRr_;`,wrs՟wا,}>A|f=*rЊjZ\,?Xr{be#?-9.}LՀY/L;Y!{W/gͰ/V<$i,|o_|Q7ľR-m΢ JL3&y 2>SW?ϧNn,ZٵX̡Տ?zTqt~ynvwIwӛO?4[g+ 
_o-J?HlqWߤEULmq|Z~Yy7׫Mzwxr}~kǓۖ^~lmzG[Z+6~4TM Oa)Χ7G xUYe ތA=r)bW=xv) * aVerOYV|0Śswóƽ˦1JaJI4ǿtoy߫8KӟjnpSGdL vn9tNS?7TR&yJ20@ʫGfVsW݁pT+wSjyzi3fZl늏̒ǙϷ P]knw:~[SL`*JI- 5făN&Bb+`%:JT\s4vdXqaO~{XJ^uf딍 ~QttRg0L {1DN鈮7@T+m qqTUkufg[ީi'}7$(`@!K.1wNpGBK3,Ń { zzixaQ0>Eu567hou{wgʡ=£U+Sܚ[nUdeʱu׮MbYyC|.I5C խڬ&&d `enj(XeZ"eLE!A ty^|7cXB9QR*`ƔئѰf5Jhu 'G[2f lOkTgdpu>eAw}&ݕbTk\TɌ֙:":&esEf&հc>ֳAs4#\i(4Wϵ1r$] 1yTZc0}LP +#@cN bnx͕&uWN>5z'ګ:N>쫯't61" `gyZjaNz؇ c:JiCH)Szx[ 4@-Xhm 4wFبQ1,%`#'@X2=ٽ7"vx.k-:Edn+4K],E1qߖw2g2{P!-3D2ʸ8f{q-!8rP!XUHl#ƉProroAU9A {OHR.ȹ5p?ݫGK[xbΓm0vu x DƏri~l 6D(L!2 ,%& =ZbAѽA c'Y!J "hɃ$h^ qrRwӒZ50x2q<3 =z4n p^]d CkkpaFѠA tu5LXfkOTt3$hX, Qݛ?mӹ}Sوb+37R3Mi]ƳE=EH%Y6UZ0EF3fX[ezjZ|jF>VhNbjuZ[e0LR)$rP"A^H ecިËW`D9DŽ0sTQ( Ȣ`;X+\$JH!} J٥p}AӁ5͛} QѠϾ?}@@=!s3JNq"oL74Ic'@*Cz)[6HF7g)Oc.*5X@U=4z$x*7i?~ܷWa~R0LGB>Z@M >lNFaK)=շw.~;$7E>U~BTᮾ惑ao2}=o|v@vtVH "TV3=T \d)1E,FkdQ`P["4Z)c TP?0!0jvek^jYϦ_+MPʕW%N.-F]AT7wDٹ|*M@Y}7n`iIsnϰVq2S%F"Vc‘r!6:ʊJU[142A;ζ̖TxGD71dNF03,]N+ , (O+@2X}=t=* IhJq,5`G8xړ`:2aCAQI< ߅tHl2 ᰂu6F KbcO|AhblM836=ے< mϯZKN4ь,|eb ndJ˰iF<$=ֶ+UJql2=SCSR\"?'oab{Qe M 4ݧ}Ne eSDNfR J1u'(µܞ 'a ̊N[lB@!YMg,[v- ]̶EbcqގSwQ>wv43r`dlU+%jU(c?mWh'nZj%ˉ2 0F0y>ZS]{TUO5gi*,xu^gٖkq[cb x( įL$_2Z%7t nFu7:Yބ|52 F0bɧbz9;[%h{M'ZH50}]"rN(YO M+9V%| Qsq^xvooxs믿&ūo\`.^w`"t E/4jk4#}}}v0NnЎ] я7_qᒗmH#LyaueLBx`4c2FlZ <0mC#QP8a$bcP%VqAYP( u&zBd217LW<UX% S띱c"M"hj4BZ"zƈS旐*_oKH3 u.nw߉RWVA`# V܂|%fw1JfwwIOg&Ze{3"+Ao63 N'bxSp]?Ϊpmn]c}4KP3eMٺ?m{nIJ啚C M)n[y9?~Fv wcM߉\Ilۼ"؇8X ep^YXv<-.˖e*NGVZf]"M쭻gr%_7^OHR-aV3,6_Ͳ=¢OaX]R[+=*Aj3껊wVMoRw]nwpUﬓXӞ^T<}xXu!KkVӳG]M.S[2rǨ  $TV**HR ^#ѡD`*hmI^8>TjhBuF>iCz-v @Ɣ1H>0iƈE0QoDi!.iGQIZLV?D.>_R7b3gsܣ}6";r7׫lZ|y\A8AGRh30T.ƘdI [E7Z>y>2a\N*o\v72QtQ>\'g*_N^W:BgTJ;HŦcv2%[Ѵ)'BC=ISz"2Qё1wE`x6pV k!x (IG&t)L҂FN<6_Kw֯nI7>)51v0}AzvmGLPOYtv8LH%T68i6F{MZvk-tB~sn޳: oq9=Uil,s5!3AL+aj|n&Z!h+mޡNzQt͝]z;صX^EB(,(=@d=Wb΢d2Sj.Y1REh=|D!P)Ƞ.&X)&;k5`DՇMx/ٿZqV>49QZ+]]GY)TT$*9G+T͢6qtif߮zn oxͧhHI(( ѥ"_(fYs҂}98UҁQv]0 ɂLқʙ*FQ:ڦ x#|D[I5Άzgْ۫GA gKyV4Jc9^zgR4K !18@[Po`YJ0hAX (SeFQcv K #jd8RgC*xvR D 
_E%>!a)U6e4M ^Q7J~=7q%d,e+kĢB RCU2p#E^b<"~Զv+"<ڄ+l_l|-7A< ")˖0MgߨdOď@:W0d AN=V`lLBX;k\Uǚd=Q3,y'˄1䙒Mc5͓M/o׼(%ŀŶ'Qbo=j=f>JSR74%l5בcVPIUf~CQ=MF{2N֌[,V劑Q5ũIHNE09#C({/@&~QxcAeqjlXxw<%kIA"kHƛ2 ڳT)CTH,23ߍxQhtnd;}U$tcN:Oݛˏ_~/?f]0=9]y=zWt[Oل贓 ]uv7?u7?.͛{nydr9nN]nz8?8u^\pqU86mk'j_^T:; =;ucs'_:TO^dyu~(ǺGi֙uڞ!h1ݥCO] W gr:`m&A3ӓi\LA~8z:6zu;1x4>=Y̖T,\WyEdWWGkK6o}Z՝e?>.ROfKѾbw)}XZW׬"p&gw&#kхaP irƦ$)eֻ[\Tlz(Q@ٗR- :H &6R-c{NMSER)f**Cva&|[.<ۂqQiUͦ.YNavc;(dHηm5-vEE-: TX!yҘ$q4'${XѬe,!ZM-"z"3]KIE31 Yi)q$Ijp:] ڧfR]Ԣ].v^OAN &R1T&uEƀ2yt!TP4]ŽcOޒq&^ Ψ윎o>04!q$1yU]NjIy}OYxjS"t?!K'SZ _zL{I9|hcw|OhAnZwÍ:?Q_dT7_݌=̣~͆2Z-$PՂjfNOEbhK0:Lk'<kD|7`n. ݂*aym-v8-Aן+/Ph>kV^+PON>%QzQgIDc%ב_ުV|/RQ6_:jYSI#F$ ^砳2DvHV Qm۔vB(]qH[LaI<TT)gENio57o[=6 |-{%/W3~_^FNfkwo;an'8n1߷ b~' l󞌀6#jn6= B̺ˣtpL qxz >~}z=6^ FA{/e&fI Lmu"bgIیE\&U96T'+oEct{r+]$6>RW9uiڝyM^G4dtOC vRٺ/Oӓ:Ⱥ;\̯]>ua@p0lm`YŲWӅWKGlӗ\6w5٨rS)߆gmTKrp|@~ˎ;SM2qgq/O?8޿˟?G\أwyw׷ c,0˦ ٽM>kpg~ j MMVm05z 5_yɸwՇ0l֡*{]|{;Hզ!Gl *zzi+4mR?M']ܩ>9T!Jmyiۓ&~OR;uS+ CYO^$Ӝ!Ǡ g^&Ԗ#Oor>p՞t05{^F?#23”YLA;U݁Q 3ɱn6@Wsν/NLlH6+=\lRyuI/=+GzVu;c ZiB(fPyh,ӶQ o4gO uqO"S :S ,SjG _ E4B G>6I39&Y!A=Ad.LF))%JNqWH Y&pPb!SVڙbզ&M<~ݯ K}]$A3g0,ku3P"s B#! I^FA v-m y j2,w)(.C>N!Lȹ;mЕMgxIӯ/G=K]}37sz\GomIЍt87IS&N5]Ovͻ}k% ,@LzrRBX'1d`Uԉ'L0Q[c%"mQJtzǘTs@嵑K}mZYFO4 g-/@XNb ˡQ\I l$>JBR74$El5B <QaK:X /)$?krdŃS#.yYdFtt3N Y&7ZN?o{Ӫ~S7{$&ո?uqX(9k =`>>Z@PpHh&tʔy*<"騏 "(-mXĬ&{3$^ʴZگYnW`4+|푎1Y0&("1,XZ< 40%lc|6푎HPtkc]3Z^peh\N Ԏ|_=c6jXZcxMP4Z#@9:>g&L3b/̼B2Q"XM(Pb.Kr6P `JsS73[nLCo+WRx:: 8)KK:sHH[RBZEO 2Dnrn5Ns:pM]DVO < $FH*$r՚aZ#XJ)>SV)k5)gb=Ngg|RZkZ3FGi%7*9pƳpqrP:H 'nC&ɉF T:֦, ӃF &eDl(-=|&#U2VzXT$c[[h*BT/?ܐst5Zܠic}gnQIR6,7$ÀJM)L&3Ld BXcŖVfUtl'?W]̦&jJFd2v̺,Cthle]m:-v5soҁjIǾXmYx!3OK]]em"d"_k0fWDtN68+u+2-gB RYs# #kaӞ7q6#OeYmMKnn;\R1$A3Zf9#.J/BJ!P^..n [iz}U,S5` E?~GǕ4L:=m>HK>@' PGvPf)]ɰ$ju(P]-)*t܊PGag璜*X$o%ggrr$)!13!4wI)T08 ׅ&RQTW;km,"ц"^uK=h{nw:;εs,TֻjZ-scʄT"{IolKy7`x^2}-ܧXu; yݳlpn4OO1x>ԃ  -Bh̐ /CǓ*[xb;rB\mu! 
ňbTk(F95cAc^:_RΕK8`q.q]|p@|U.$x*jړN +} PdI BW;iqfQ݃/1}{-I̎~!*졢oDGvlAL f(š c2BPQ6#'i]ϩuF_,kbv|p||o.r<_e1AWtkr1JR*#Kq)QbYΩfXMw斍RF[繗ufP&~=K8#f|j~fxfnDko iVSٲ-&oS,*;A(Hnںd!½ BV `Jd̨I{H Z84f%>ΌJ\PJ8^⠏B ;pMG4uʣ|lwv߭6m\wD8;lEq?0D ExUfg>s$+Mqf {.x:O;鄞|sc\DfT1(ϱ0򎌈2zfI*#jlD쐧_PWv#/_+x%"X.rdfsV-ڈg[Xq{1Ԣf?  mFon6 o dO|681ؗ[NE/sR8DB')` {`0~}}s9s+ٽ{* ʏ(G^bԙ d\)-"J5ftTC gy078x2yЌrׂzqiPF ǽ\.m o[?-נub"fPVw}F>:x mJ0bW|qrvDžnlO5Oo>-۽'M~u{ٕGgﴺ=ǃ_'7[#y~p2?(ly"ƴz<~?~ߗM~χKRG+d ΩG=JךCuC/ޱ|MǓCKy`% `_Vyy^TDIf{%֝.Yɀϼqh~ݹ8ǥfIBǟ}.toyTۮNvSMlxZMywTbYtӢRAj^U .{iT13P{usÁu{l;6|{ӥ\ 6gD˲Զڬtn6zޱBMVuZNlnqcwJgmR ,n足DQ[K^,<"=W.g͢ zQ0.˘d ]@?NۋNi0xe  St{n R>eeT-yr'=xY*U<@Ы˻,ϛ)( ؗn|%[Ufڳݮ>wekOH 'jWokg]`kFЛrOSN_Y2.x3`U|hK>.#wX߻/04M\_ݤ5~]nKT4>Y-U;ks6?l}؈qm 8P3P0d !}nۦw(w] 8Wx9{s+&0`rw^{C1X|=󪩓|C-eTqxz4[0OLY<g]lYWuO jr0OAK>FM l$,,HT-mm>0DICJ-+:t&aMM.#ǐ7@Ti͜2v'x?atZLNͫHA?rbD?r5l1RIkϣ[w8g]1s.j\ؐpo3'P嗺gqɨzjE}(J əWwa۩8iͫyþp|#冏Kޕ7g3Y߷ .{Aʖ 'S&xж#-Y,k4U\G%ᔵM'??mΑ~V.H׾>ů*Bd8:JR\Z#ce[x%ÂJdtP3U ;'bs6D͒% Grx}O(CH"&)j8m >ߎۮJHryJKr&,>592,w9'|%Rp(sH.Kw*h]ksnq|xX:Ntb BjSJ LJeCNeN>3VMB$ݟSs[V0f|"fp̢v(%0:)l]{M24Cɑy/&urxe Z ? 
]6T1nWL&LxC(T@˿+o 1CêkJlw*2|~l.U=ebs6:(}O|_Yc}";,DM(Wچ@.,X<1&Ykw}Wܰ i|8pe %͊tru6:l_ lxGˤup-0N),!0 ,ZU4E )U)|'\ H@rlQRaaO\seK[ч$Xa:3㝐V4E`K f0hh>u @],gZ"#2Fx0`` 8 FS3'# e’ [S$2c4ׂZp p;-s7JcL<XtUSJ<d.]$@jö N%, Ae\Z5dES-П!h) \F H(Eb ~uVc4LP ,ZDh@yM* o%= _KX]@WLQ9Xw(3e]K& ϊH8 d"܄.u->X($>P"25I&Pbc*gI\ZH+e2&x"͠j`ol!aP0`pSVY@ CާDB fE`ׅ&RQykc~k@A;Gtn02?%( a"e21:b)@Y ``#tkᨫAu-pSѝA1F!p3ǠlUZPC,1vR p@ /uJ&j+0l't&޽.__zl![L:[䶙wi;rXiNhqEYuDh eb~#{%tMކFŸ2vp d`: (N6GMƪڜ{Ō: hX%Y֨48&p-I \9vi7l#:{ 39(ڡl@&RPƴٌd1C0",0wϢb!t",U@k6֜ß?cy!kˢ9K4YF%Gh3Ko^5 BYU[fq/z;ZzW[ 3]$y(M:[P0sɺvA0q zyi:4ve~6A/NQY\ӘI(KD0umMW-d0L&Sns`*w4,E4^GuA$ZkpyHY5X`4vfc\tf ƂD[A,B2!S : #B*|Dz;`-2'ӳ [W*ϕ+"HƷH{u63 ;b[ѢaT( ȈOZ7g=kUi,27hâHY=+NFp `Niu)Fndj3Rk+kYrP&EQ˃>0t\4 Z`5#w\MlmMg TfD{pk l,~,gfFXV2Zc@+hB_?Ho`p=qY9m!nA2~f"|٥#hRd۹0g9Q}zԾh5l.2Lųd|}Jv[P>^~z˴##X>2\\û_P.r1uA5Pr4MeQĸţns:*U2{:5w؋;k~Zs!/%4w-`vxy͇b<7Zڛ2l5O1vU)rV]rN gAlN~H'o^i{i%vmb]0 )֝kO;foe϶=+ЅS|vm:prJهl[ڧG~D]Ema]v*1߁yDFA}mq{m &\eiVx`|߿o٢V[Zq^}6vXwJ"@6sP`!'El;X(bJl}z4mtk>< 7?Ǘ zoG 8EK\xE<3n_17Bue)S5B@S:E\o"zm"莲uPuKt]kϏ3(\{ 'Og=%-͆r-KWq[>zunݾ9PhN2{ 3[ƻbRȌv^4N =DS/Ũ5ߩ; C!m׳KٺXs^}9Z<խۥf)w&X/#0XO ,k/GG;5Ў:8we%`ĝzq ҏq%j}w ݩbCcVe0+$$e-1]JOԳ9X8s;zS;䓶̭%1Ay:vp7 f_2 Wڣv'G'<0laeFW"pg%vRLM OQsAc8}֮iv or0vdw6En]\CG?r7Cx~552i;nh!7":u4PZXI㮇ykc5#:ѵ;|ՅkSɼ[ʂ)#Li56e. XMչ4`wʘ~fsS~_ 4^M|ޑU4]bxS8dKOHɪGG#Ճ=~0{foS+{.ڃ﹍%!g8w卖X| T6v*Ɂ7}W/np|Ks#tdEs(HíRF̵ï|-ƚ4Y>Q(u<YCOi4XnotRH"^N  rr<יִtĉ"oR6 :sm@f[r͌J`OF̪GBaY粐F@@z? 
1;w-1Z2YRYPr"bu۩k:'fwi?\]mFTw8.ϯ.gs fWyM1v e:OHZ ":?d ۗ#~.C*__gN5,>޼yyWqA/^@/Lp%Ǘu;t&rOw:JPݔ#CiNJkug47C Y5yQMgOղu~8|Ő2dJh'GӓN Du'q[(rU\[WrU RH:[A\}5QS|~O?xz=}3l5٥1᨞_] ~7>{F=/'M%$_Ŧbbb֝%b.qЂ[&-z*sx1W97*.*ԍufvR_ꤤ뫲#^ _+fį:h'Nc>hf/?f/f /ų߿h|z~鋗g\g}v:A LKBb8dd63!mpC~_EEҩ-UyoSg7{w,˝/-sk?|5|8Ic<~l|qy^懫CVYsvmiQn~fc_AVy?V܇ q@`ӭuq/G&vI$uΓEv%V3kQv`X[vvId.3Mē E4u7􇟗^ c.1KgZo{~fqchZ3q]GvgMJ0ed5hBL[Kd6{Kp5K-E]h]g/:2|{翻5'n5-g0vn#o9=dug62/Wqd88$_5"բ߶Q׍8UuF0G`v;!x;V2lux\W;].s VD.6&yc|x:T.YIx־Fgpg"x%3>EE&e43Pۮ.gM m^Djje;8@fήàys}Xtk,N "S:xV*ZD{X>a$Kp^a8@p飙ۗ<ghqMgsf]}y\NVh287kx{w nfՊOdqn1cK(ȗ9vkauᯫ_ t{w>o=`uDuQn\A`q#=QEtʛEy5罩JbZdI1gݛV[;7[ѿy"U4ip>KSf% ^0EI%3ȖXR*CW"^|4@`R)$rDzļQFGS1&x92J#"( -J))$170֜h ='.mxYdJ锤:N<"WnSm_gᇃ>LJv(t.Fq5fa=7 އq-tFs]Rp442V_gPPJ@0zr{Z4"{#b9%K\e Ev۹(& %/5Өw4H*0Mw-4`ݷOu:k2Ɵjlp>Ȝ)W+^ mB+m._9+p9fhzV9Yͺ<9/hi1ܬ7CfndA.8Jc@CأKGcӈ?D ΗUq5}TF*Hom]ԻȟeuAO~G<9N_P0vY4a#A29HysQ bO_V8-n|yi}r&?K6Ixi}2UCR+߆A |w={7sae 6br$tLvf:Էo^yLܯ4ہZtRVbyq -WxEjf,YڕhKRd eJa p#zA&ⱃL#΃ \E5FHRl|kBp0)HZU 'n ;G(DozBVnv\U"'j(dxDGO4T>>ԜsD"dlz>-# dRKJ 3^ǖZ#MMݩ] eڰ!@*4T{<Su:btwW^ _.e`k.E1qV{tHbaJ8+]{k֎(nƥ:lW_hoeö:ޓ@=0FCxo):Ǿj%"VQ -zG-/v>g@/99oO\sMjQ"P5?9j'Dl`,`2KB u&zBd2qiX$AHRVa S띱^&yeĀhj4BZ;Ntn$~p$DGґTnVɻBx`c79>+~#X845>51E]{E!VvlA$'Lx&VcS^|h'U8+`Ȅd\܈/&2UU}Roz󴭆z5>yES>f> 42k57h %qtьu#}>i{Ҩ@=Ǩ\˕].$0 >zq`SwvӒP_N?S u}!xPT:nnT +.꺚QRi6X.m΍ڴjP'+MrWj@zbo%(]Btٔ^= QnYWKkj o ?0WWy̷t2R->wyoX;*n֍ufmDzKjh.(57e3PX;d6gK77!wh7Q.vs [4Hcwқ2ɒIaKM-e3qwP;}ۃ.|Vb- 9|tgFb=aqªsY~6D~)rx&5!^+k)L@yM>jb w MX5Yʉ`3vKK].6@{taʅݶﺸ%Zk\wlnic=h&)~VwOΎ,uյW4w?5 aSswb9a[V{,QM[DhVXXSMM7QA8һ1^2uj<&-y"1&lyډ(#Ƌao΄ץ Lx'l,KT;z{<4nd2++߃ޮ3*r7Oƽ;XFpn =WX](X2t+]1N_ك]C_fiV Ҭ *X9ku"Z#ӊlX1WA]_m$s%5z!:fԌ.Ud U3(ӌ]5cAm׌>s%W$CQʲG͗А V de"MRI&R+#B- \>{T>%~7g[63Lm//1[$p4`8w(u`PXMsTRh•D>8V:d I/-:p%)"Wj^* MoxC^%:=]"R^/+JⲢET6͕[,XC c¸d"D8I#(D#l$ Kwl[=ShQZVA"ǵCk\@j"r}Wg4;`mDIRVU JKF9`[HdDhԈ H!UR aEFz9֫_Ղ:MPcRf+r *13CԦ5Ga%,HGQX-S #cyUPwO,q9Ӏ0WFA=&0덢3CșTrIsZ٠ ~OZ*c=wW< Ā7)X.1ER"8W1x$Aasl#px4Y Gf4d->ԑOlJ>;F5ew6e}pe:O^DH3roa&F),7#SZEWGm˷#m2?TEȾ'3+߂ 
)G!jx):ψXCAAq=ќ#z,ؒi9hG6 W;TfP_&hsK,DBP,Ξ;)t H}]'˧:{Iot2i9L!)Alˀ-fԔ bT~oҎ!)Y I٫-_GHxnq$aEHp\X" $N)"e>  [:7 #$ Oj*( HmKM;TgE jޤb5j/tY }Vg[$R{`D0n v"L(BQ` #;U3G0_5.`h0̝ V5#rp b6p̳'Z ,Ζ'  8\%\=F󓼾_o5i'-iK *Ճ$ړZ^5$IJI2nj\>Osؖy}w7Iz>N^D(EqY`NZ˥<8.0!k)Ch`&oQ0!]GEQȢFb@P^j={1Ws:һ]:QwXi@d2`| $/i;%JnwO0}/Kn˲eʖJDV(V\ޗ~" NǨGgp +A'sfb>1,{b*NˏPYB#yx\V1IF}ܤ 1 Gõb!!š}v" y argF~!cc؈V 4Hʳ`jS_fI:6gx򲗱݊oϲf^-QV1NbUOc{.qnîʼj۸N*Xh1 QfrFT9)Y_gMypIĆ8(=$2QI0qV܋$.],,rn |DTynb(7Cp̕`@>)(O!>*os@B0'PX\-@:VbI? # IBݸD;5I@v1l%}V|E1A !1B*i-pl,Նs7ʣi{V))P5)W@zU'_Y O״p3fDKoIHjmY88`cDr"1aښ+&`1K.zR*>͙\Rp#c=R ]TBc^J1yaAXٰ gx^8b'b2LY)R*5AS晙RIhDЄ$u[Z`V")!Ն;v' 0 "V["Goߔ9J<\v,s.D3cqFӚ0,9צ} 9KJ6@qR K( , BIs#4iFz!koVvD8N4ՁpmuHgVɎ(Ye\t=.x늏^Ā.ifLsq^ U2cI˺7b>YSb?86s= ?JWDf:Ql6N\JIfmStP"{!9*uq$$ bWJI<8 rpMqJ5u $BIAPR}h3N "ON"jùc4L_~> x{rzR.[Ԅb7X[hYxs}~< L9 ̕XHZsI?LM2딴ک$BIy$3MU!&\RGD0)3Xj0R($)d$pBX[6Bb]Yhwa.ұ)KrhKѧ;).O^;E;t^w}*U Lq>;U2RX?a^uAH>KghWrxK,rRhQ@3Naeؓ?upfgt">a$w{~E>\ }:sJa!#M!M;,gW lcU(4>ǃ}Ԙ˩},I\eT2NOإ|i͛A{z2_Pб?~AòZ:c0xsƳ`Izyu0%%tα=? 
{wyx̹z»&BҼ*)?Np9Vo?DBE g4ZUrz$$?d/4梇7i8BSryt*0t֐v5*'(M5AG2 E>h5,ʠЀwqsSt1 2!V~B3iJz*%k/K( ֖qT&)B .V0τ[[zkKs>\49+fb斁2Dd ㉬x+D^gEt zkK<٧XZ&ӓOA|6sw).mt[֝gjh*Wr\Q8Srk"L' &PR)(lNC(jB9aՆ]oY/@,ojl嫷#rN1٬W֐'fA1I;~ǍD^8BB24 XB*dB XF %S/E1 rkƂ98,tdZ BGnu!%o#8N.r|U]@tRմ,vul' \~sv6YõI~v zʦݭSi#ث2dIbzD|TSty͑>FWML hC`lrh,*I$yW̮\̎磎8,< i%˾VM/oŞdY$ݚ\#Nj9;D@JXՐHhJ=(a8g/vnS:$ƾcɥ| O^-lv~Xn{!:qٍtz即A1r1 1, 2kcNes$+%R Le-/!L#cFEo@fB"R>&>bgB.ݫ7 z ~t)GZbxG{밃Z"qÈjy Mf14 J^L'y`IpkgFx}x1гzzyqc\Df[cbfe<̂TFXPLlyj+0,hݺ9+2{%"XΆ!3Y B+o iѬ{v?Jj|'ԢU-ݢ.=-oy#ykg` +nO`_8Fv( f1o1NYa( /}@~:ڃ2a?[+w?\\_vfo?uշ51IcF3ETi?OdzqWk󇿶lJ.۝-oӣw'Uߛ_.iu4)ߟMf_~Of_OO3"_NO&|˯&ɰ,-ďۿaj_Y+䫾ƺF篾XJN~_doaB]yƤĺҰ)feWڎ{M 2\8g®v!K_|=kʏ<0.Βџjɷ4r>c/M4U\osQյp*pasNyLK /%Jy!,uw[FP{S@n흚-}^m'].[Cˀ OM?E{~Wc!HО|b %f'P't@*Qcp[oFg| FbԎ3IBh{iT Um_4&f5*lHUȍ8O0Մd}4(*Eܠ5wLJWJ;Di2(gQ,*r,%^$m\sfWu{i-A@K@eH}9̢1[Nn0 O#|@]m8w',kLw?ֽT%𖆏L;TmLBd[\gfS8՘F:ղ['F7Ǟ=;*.foh:&Iw)ٓJ$k)p2>csj A9H76jؘ47YWgV6!v`8.4̨]KDv0z7fMS?Fh5k~W_/m+jAKmWԭ[iOR -v)S`Nu=zjz#=MQ^] o9r+Cb, HpC8Ƃ[iF7#)<,ɚǎ({mF=jvU+q0B HE1t) 0U%88k}k1 p ZEAzpX޲=F}!o^-{+Ɠ u h9# .C`’_-נa[:n"dІW|_ݢf6}tC>WT]yv;W-+ʬ[z?>)#(b٥_}6ѷV%Jk51 g[rR@ӊ֔GLօπä|o/"H%!8;)=FJ0^Q *!QAKX`r +8ms `3,Hyh.酔{oֻ`y%=Y2,:l^E&emwө 3MkaQGa"!;S.'R2bx]TVL-ᑰ&EJلuUqH.z!Vz5F *@IV3qvOkUz45$ko3_ K VASc~^]3msn&S^AxtCuP\7?U91`k։'Y7  F&k]HRP ^,`24Zn3=}+!Xhgg>H'!{Hm*" bF<6 e{Toe%^+2úb,(d)ʤ$v(\qDIUc;k&Ξv6կ߁B(jV ^?e~[£ґߪ 6DZ:8pSDI:#:ǐя&n_:=WBAgX:˜S (g1 J* (E aZ͠V1NZqIg$퇌KPZHkW$CRa/`ICGqT4t iY%gAxalV5Ȑ(#2Ǣ3mupt)+Uvanf} EϏȡg*i˯\h*+aRͦoe2q&Fhk5G;Xu1XG3 9:\cpIК[K8/6%p:?7;X/nL9DM8Y%$&_) &vG m8(%tjۓ(G@{M8/Ɣ& /xRRcY!IMtTrjFt g#!ʦ85s@IXee4P0ex焆Cev:P6gOm)TM Y1X #eFʠ HX2rѢzOVeO>-N]vC)O ]Oox8:(t}N>.o~ޫ~ϻOa2ZtvݽNi>㏫uon~AtA<姟{-p򨭓u'oN$''1H[m~58Ow S/3__^ƆW.~ޅKa>NҟZ;R*$ztk9'RGScT,U?XqHx;P,Nusݻ[wX g)LroaNwod~ЇӡUyLÂz҆I v"N݌\tw!|4;}z ӸTknvYoM_auٓ-z/jb9 d'{ESyOsSTB8taG%i2^ܝί9< /ϭ~KOd( J*ݝ\ dG<߱?WI1=R)&VvSVրnB ~ 1"W}}>UW{YS,g7®$:םR]Se:JZ^X5rsWMr`$\ΖY%&5M){+K18RddFlzO yq1( FrB+ mSGIi'@J. *:Rf*[%3֓ٳlRTћ4^cRɡ^;C JփRZJz{ 4_i$A[hH0^!) 
'-!/)3ԍ!N:.)A1cƎƄ"ZѥV8xN+HBIW j#G EF~/lkqMh.NmY׫z-/Lb!(\  &y/z`J$8 z!82(>mǭ6@Y $I(NM`4IL%g뼁 10"`b]mߡvuӯ lt> >K穘YI#ǘœ{1p|Q%&m뮉Gz/7 [@tq?k2 F6+V!3¥;pg4wA2z~Dw ɔdJ|%udu{G؎S3~ O+]3ZM bl*D!<AѤ$SL$%]{+rQ Pri2ʮjIQ@c(Z{fJ3_L3/4/|Q_" {si;ӝr}.߹&Y\v`Vh* ER$ )Ȭ$pm=r`y\+zud0Vl v"zSXTk챛c$ 1v׶6@fX(yY2("Av-j[)'>[#] dmyцQt#dFE'E} .I%j1E⠚lh]"LpTG' Yڧ:JA@%{ :E( ">`Q:*HHmW4|"eQXLbu5KEhGZc*S+qvcg}DKM=zB-oőJq/:Y!WR! U] }BLvk{KJc{*=֊H="j*3%؉JkdR4u" *i @%ʨ2{(J E EeJS@z#JbL05&+lٳfCfӏw?Q}ye]:[ 7ToXm<η/jU+YX*@mk$c p!fB53"ئF~(rFKEdRTчUR!)itl#lS] EuLFF]Kcc:ܘuÔp߻hFN 7mct%:} ~q~صNBB{rkzGsK5a]7'uś>>z16.Xu\ǥ#\:'[6p!m83*< #6#db,#=r}IZfK!>aen~zA{)8xq^=_.luPI@"JY QXa ЫLT Q,yxwۏG^/Ҹ$]26JL%%&mOgu>[n˷{:X$m++E)lYnB~s2bFdmJ!^P!B!䱆^berqy]C3')^#DԥDTdP>T֚(MV }W9p]oW|y-{!MH4)V-K('qo")Y}(- [ofgg0wa|hAނ& 3Bs4v}s\@+ tRå$`_z|$ )u%_S;—1=^'*9%w$NNw[&&*g̻\s-r%!,rA15v.մ +Oqv6*C WNa*h/YC6t1NZAk(0ƨgq0f)R @NزR<[9)7]Bc8W{4@Y]j~;r{nƊ;7bLFje9(#!ăBV(%FyC"vL-Վ0Ri-[+2%{T ,:$wF A>R$ -G>%uA3t $0{?NG^Pus;UJ%nn<ͫ @D,#Fb")*a%5FH@* o̽jsXo3"dYSƓ <7r{SKY.>Ds6YPK9f-gYl4WJ@Uů󄪊C/8D5޽[TILWE%Orַ?߿z^7<㏟^7msmP23cnG$/6n/ch nd0yut2L]W>%0ޜ0#}ikiW?I*FO-/~t|6Y@^o+ 4pٳH%6w7I"&='JC[Xveg2Cn_.;5$Ȫ%~T35qxV_ OgC_x8>tF ʥm2OPZbmXR BUG۫?F'Q)}dw;:ms{꒷Fc'p8خXZS5kiGkl{uOH+%%> RM\3u/zf"yp (7\;P ipˁ6Dv%ci]3MN?^HU:*4^, wRӔF+%Z- <DB8NMhXX Gt B8(+J"t](؊%Z/+vm0[T;"E FHd 3R2K`fvK0dcۀl, POi.H!;% TYV^ͦ!iI}+O. 
%f(&ƅoX\#dj6WnZEC6X 4ݧ=c{X22`)"{w$=Sӽ0y eE-6`M@cqTapA?t8:*1, cYc.ƛl>pM`X%X_U{$:8X$vݨt|+LgW+  sJgɼfO~,<W_Zg#`N(GzK g@، P[ONۺ!ݨn]Ha FRF UL;x4<X98̮ƭtu$zmH6jIs SNQ!1K}@"~Ehycqc ͫ?I?߽u*>K-W5QQiX3*wKCl:8ֈ&n'1$l(2C,RJ#c`⍋Z`L 1"pMtZ%KFZ tAW*@Bof#`Z`TK%FYsNuv6[g4m6m\u6=a͡rʤ Xջ܏-num:ra@\Hc|xHP``ayxSijX)rǞh;u`RB{C8P!4VuHtw,1yXyû՚1Lg::-U(R@ 'TJN[0h !',Mn>8%tBH"ƑrW^)ǝ6*1t62HV"%VXz \#0ؙ1XucI"l;2{fLܮ{츭OI`1sD ؅("8Oc:Nyx:Y*x^ןWާ+5#1-t2\0wO9fMˁYĞhr̙P:ܹ+4 O4s:Funñ< 뜊 .pv[.y%FEdԑI(0HiX0bQG"ʦTQpeDDL &B` Xy$RD;g ԊL}tT{TuB;%rIEPjmi}p xT$C sk>3`' S=?`}q/ ֠^'K)B]x)k?NvuHN+$T΂#"SQD\~qsUBc(!"`"hadDkז#E(' %$ju}@&Nw+^w:nRo"S{nEӻ}[%oo=`uDuQn\`q#=QEt{y+xwٔG-?]'- }^Ӹ7Qy OR4 T ~||E|X\bH$ٰT_VCS(r͙E,7DAY 9ͫ=h楤hUY9ٽ;v8混o~ )Aw)|Tsz6˳^?;xv>{=g(lGL 0hJThǜ&4 =BS Rieѥ =wE MxJޕdhF! A!stWEPV<"(dVy Wxq 0+#Ajpk,H H$6b4$29 @vY̼LOQ!` lIjd2Bjm绳 ` l,%5h+S%4:a3eo&Р*B3?\읲&g ӫސqO_O;v{oxE@suL2e<Ȓ+M﷠H=lіږfZ[A` .~XZ\1/N `raA$"$Q9}5NŦI>)3"[|5[ {:v;m6; Pi~3ÙDX0,RGi$ x^[aAùU៰bL880\udJpTbY-ͩ*u ggL`&EGFlj)׳Fg=9Muxy`[{]1XdYZ$[*.g/w\ih31|(}g,5MKir 9 y&%>iqLT8@`,iqeJ \Hq눑BU#69fpiIyd"Cc g=+pJF"(zs;m3)!=ϕqko6a>u!9ed[$:} ` f}U#^UGW#al5.b20\> I}C0jպ%iq1C# &]a춯%loOjᴢ҆I<% Hk(*6JD!h`50O^I P1#*yf~zyᎧa.7BIJxIIyKhW3RN60IL FyaCR vP bQ@i@95Dc¹ >3ޤ%P&zNiAOnU%89f4IvcpU,F:O׫kA=Td/b8_k1n7ĦYq~jſZK4-iZaQ53;YE<[(g\ޔ5hkuza9Ihx\+`ZSvV(嘧f¿l^j^9X]_GT5\?큚E|]̧+؞8y_'/_~eR$楎 Je*qP# nl:xyůvLutŇd]ꙖyE:^{~0o3:/_9始9͇_mg-oh|$~/Ʊ:`Be_k_s754Zܿ#bm~|fn$ u/Fyy^g{m rn4tu6vVE;Y;l ↹su.|()fBZ߮_^'Uy]%O5siRSJ䆹O]jtΦSu\}EKu4AS!T?XPM[덣ȤZ-0G@ۨ3yq 7^!2Υ ܺ$Iu&AUI !іSk y&< z#vHwU =2t7fgWmp#Ep>}R N(T >%*q^JhH9 Y̖e =.t$[iy>iJzCSr+}kcrc( Dx0ؘ6Q 1!+=wrnXrs=t1ZXy isl> 8ɵ{8^20~t2btnfOzWxmygd?v dzZKՒTv Z[h79__lHHK XHe\6,>R%NV`B)Jd*m@sI͉5c91ޞcbpeèB1G>Da9O^keʌ# &4\MS#e5} 3xeTʤՔ) zvFΎ`eK׎ZdhdJ\Rp,Į'6EׇQo>_6l.pYvSv8 If !.HYc[$1bSTڧ X1ے@R:~zenf\A`ZE6L6T8K`TDDYP;cP9udPC֝_)n86xDxd( q ָ,!3p9>X9QؿjiEJ_LRm{H@h R֓U6 nqA!hcq%AW0Rd})!6T@!BQm+϶G1)w1v{DZ%KOg,E4{W\L*{yQ-f1YDMR=5|\.7Pq}5WC{$?r"%OZ'L>rv|0P/s7f gZ 
4_!s\vMN{vjC|4{Dv(ٕLw35E^?k7ϋ+^^O/^(쒘3hƣv5kdAM;/!]#)9G:ucYg715Lk3g1j5l4,GmkgՊUVF,sVi*'=t 2yC yEgrFȎ{ׯuɿ{oN(3'zuۗ g`Cކ"O&O#S|} Cs#vZ>]ƕ}>rǸ1rڠZ .x9GUGP.`z#oVE駘A hoirM5U\Z#T3UA )@^;3\%fMN&gn'y$I81PHQ851l.gbHMV啵yHNV!|@W "OJG H- zjIаT댗qe_N{fCQ}f[Wξ;~>lssX5ճF]d+8LaQ3@QjA1E(!G-HSEeJ4&aY\ՇȳG.W2%It$"dSZd.0# J4J5t {&Pk=381G^smfEUQ=y缦 yrZB[ZB)M 0KS-7#;'-w$LjTZEhx:dN6 퉱9#SN11<7A0-d1qj|.pT*/  8FΎ,Mu7lk>LKU4<S&1D$@#һ$D4E*7xw'lou U=]dt1 q/og{b5/ QsJE )Z Cָ'7W>t\`ysgLE*MQE%YkҠK4%R gJo[ ula AM4K弪][fsVß7qx@nl . R*\t HV% ̤@(Aۋ69ǕB.2Kə Qrw;Q;a5r*LPB̽20>IQAt)^FODʠDy>>\*}k`성] `?_va|@F'6Qs|9O'qaǓ8tb&,qW^N|܉}m\1Ia@\Pwv!l_Eﱊk )ov{j|7~( \*اՁ|nw{O@36Y>raBnQDV3{uAt3aYm#c#?Q\ x9kǟG<;4(i./QMnjP?o+mO&TT_i#iNESԵ`l1{uck[e K=L;R;N|k<,x1 C%TTR2 A *PuQ`(1&KHNڢsBqn6+veZl>lPt!uU}}d|U+j|P~Pg/Uܔ2 EBQ3GZDcatb7 7j?,O{zU\J1nѰ.2cb1$%ISl#GbrY2qbdVDV 8&u"Ufh%NVjl)gV,~w1[^ètPAbNI4+oB) C9[bPhr ZJ^sE~bGr8XY [I SIyVd\`DjUZbIL6&“IY2\2cGfYFd>4>4x-tTs8(KR']%ތ4.G^G%@VX! ,2'l/mVD %3MP*|-"A4 24K&4;QsQ"90 $Hţx,2*toq}abԤ,^@HE@BlD&‚ՃM`&a)=xJ!) 
Ķ-ZŕnݒBHﳭ>$Jx^CRNH>$eEHJ I$LDQ0.t%*iQ^"Rt9}" *N,giDyzqҸ@P0&22304dc2NFNsZQ(KPK"CP*R&(* 3A3w*љF`K(R_\w|Cj lârg^ed1ς1A٘ɀg2'ՂYM09L/c)c ,kE7|yN*U6VUO_fi}ma=crں8MŢMӒ[Lng'6ʃ,lN T2n"@sfʺB ynFgBeP8w\MAI51`ǜ"s)wIXY{1i p+UJH?RAFMR.=wd`NIJx-GT ZuDn0vyZlFy:`){I!LN \w9U9Zy.sƌQhIJ$\`|$7x3RJ D1$59S+w)K.H`тI<&39JHsP>Y3*ta5UʺPpNۋ˂W2;z祃Y/\c  $l6YnI9TS晙 Mj "A2"l]-̪<O~ %{QRU*1PәHe95v-`Cڦ=]BfxN.H;feʒS)n1t<]ȹNR QyȄ "/:J4kqQ 6jXW)Cbj4jlևS_v'E1FjD[Y#^#q} WC6xcQȑ9m5D`9\U,9(,GI (E3!hjH1'-pK95GYKvԋU֋׋^\ыC-cLy-8gFJ&6xRHA(/׋ЋkqǁCc=6.~|Gd4wR⒇4Hs +:eZRD4o]F$JMu}c_0B"ReC:4I$%Ƒ:(+˝6foaa2ڻNٌICw4E69u3*)j8mK~GGsʃ:;!@r%ĜxPi"ǓTf?by}.x|jݭ4y-]O|i(tl<:;o$؟~-:r1ןuj_jߎ/:uT[n^\ti&&`$Yg3o!W'3\9 *+NtX럐t_@?Λ|~/A^ˏ|Yto 2ynXT_Ivvnu~2k/fꗄJtQ~4n,wɌC9՛G qcs=,I=?"?S_w2 f#-nqf.-mՀ@I!u[S齅BY7V:+}˟}w͑!*@*疱r?lm$otZ+W[6B1ܸџHvyOR8I[ ރWy޵qdB) luCu/A !jFA{3)6![1:Sunߩ 4폫nvՎ` G0+MݵL wdթ r ewf/Hbnf'L˄LA@xռ]{w?f|@W>A3ĸD#.뿟dcKʄJZ1*(-uMVVE8M۴v8]Ha)xzg85*AZ)짙kWj5uQaӨ(c ٤RY,i QzǭqsJ;Ni1AǨ;\?^.lw:0< )7Uu :IMMԄ1'aZj{-=$Bu 5v5cv[\\T5,SSL-^rJ} h c6Ѩg}Đe0hB*=6l&Y Tu̙@  !EëN{'lPu2>:X!̺@ְAc:.>nY:J[b5Fڋ:fXyF9Yr6A)Ԝ"H"hj*uoD $Ymu ; ףI YY[%c3(mZdjڤТ-)$85E!'5tBziJڈ*NaOI=k^,>Y0F.\>E=uk,RsS҉e`E¥:)tuf]!GTS( ٥fى`ӌT"yˌZOؽK`rNa-OX ew:D'NkO0vu`;4V( I&j`eWi* hLyֲtH}G| ]SG:k4cW 둔9w "(+ZŸJþu*X4XYi)̨`LhHpu28M#"X! QA\R}2ɮ3&ZV2P j,:A9J +`WVw\ 7 a q2FN[Q9cgGܬi?%17SL3H  ̚` UVr ֑tGJ Ltf+%RI9nRXT50ofz)td;?5 YV+a 1۽@VQ=r+ZQCk>8a4iȠ \tiaH+fD$' GcB*&6OvNRh}Y`;l!܉/Ǯ]-zxB$>H`"<7Csu@Kq@xc#PYiU2Mt%ChRU 0ᘊ'8$;৅ŗUXvAyjN҈HDNWxMʴL5:7zOC"AcJA&Э.4H#38`G}¢"YԏDP-VwV4<̶eTX*?_L fE6 /ѡB-c 0 1Bk)h0E=[vhX< (3 B25;5vEQ0Y MYaXYv sMβ] ˉ6k L{B:b:a$ś"@[rgY`6B fxZX@Vp*FX6:ay =CL iDF:̈rG ÉrSatEb*qKچkWh([0x {\"N mj\4bUf\. 
D/6 0&ʥKHCsgN Z"!K#r6Lza!(L #"B`j?ozF}y7/;䪂w!Iz4 %Ӣ`d[su@o}H0& i" {ăv_D3hה"9&Ev<&Pwt&(e`&зL f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&зy萘@ػaA ڼ@Rzf}L d L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@v@2p8L 0 m|g Lo 43 L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@.y L !&/ _>R*f}L O=3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@b&3 L f1@˭?ZG?-h)z^v7o6y.zQY^..nI񯔢ڭg$#h5/Ӓlfqru6ᯍ_m]F?(oi@B|sl;JФ)"qo9Z6"Z9J[YrVuޚ" }̬(1 qNtKUЍ.Q<P?ja0!Z$R=Pm=5yD1X/{ݪ 8姣xO7MZ7H 9Г/z ?{n凵_E•Z'Jؒ&ЗLp#%\V)jv+ѽI9֨SOGo[hVΦ7OЧ 3ֵ{"aFݙيم5eu 1xV H[ /Ov?`+uMkD+%[RUH\9Cշ:mo5/nNkᖪw\~½_\? a˳s~G[mOٞ=ks;S1s' I" me[CaFǝ,~TtX>9Q!JSyFfQL Zj)M Wm:5DD'VKBoA{4Xn y4Tq;nC bZ_r(5(8--r5_>骙o!Oܘ-ξr @M`!|uT^I^I}[vl}Z턣k .sGxTn <&[il<૒pPj?.ҏY7;ĄW:x.NZv]jf U; ,Rɰ ŗӀW]OeKgi65Qr< R!؅RŰ If]S#3 uJZfh2/ԌHp6qFC_ F'E& zZ䔡ߏoy0h|!WKz\+wI0I1(V r-IցJ]OQ0vRo'7wWHи( ,K LdfI%&aP3, N10YEIGFvnl4eO7'^ڊYkET!n2S$@x7g.򠰾֠\pafc~Q'u06y K2ga4E^숀lWkxͱ(_)rʧ;ۇ+'R0-uTY,Ts `9)Rz"JޫG9X:L7Ǣfȡa kyW . O`|GФn]>ZqFH!w6lل)elqg 2N7("$3mHBՖKU f9QZRĽA]R<dy.)G̘MX`9 +{ 7)$IBSHQ[k\Xf: YQZIAZL;< H@IBd`Fxh$<`a g-2)!`jM-UȄNk.:e`0M(Tu_z\OUTx2]~WU3 Xԫzy[u*NfҴy7Tk=>ٝ}ݵ %yzv.IZVNpuuk|}=])|qaƍ`e_z#J}BzT ֎-DqU , 񏌞W?_RV] SS6>qM^/ 'M5F&UռJO^-o"u>i8$$\4'Mf@Ǔq}1gWêʅ_ @HUM2qr0j&~!KxH}ہj r6 vh_ø`v̚-3Â27pCI(z2%'2P1y&椙8:IC[wזNI%c !K;OpIJxl^㋗()0Eh:_˘U.Zb b֔;;KKq뷞]p> v)Y;(?!D[ 80hK# cqfNn1*5Fg/cw5:xƌ5i}v }q%@奈e7fDKmLhŠŚZ̒#Q>piԚ(c@]F%LpolbNJ FS`PQ[#Qzӱ877,t,U!HYX) -<!&FG1XN;4h*nU`j'FQz;&#&DX҅3#j$z$llϚ`=^ǎHf`-==/ZמpMl VUJb U]şAa<5]gZ,Lu vѹZ$%nϪd9XIJUL$u=R7FR20C*`DRIQ͓ ,i&4w #b1qv#.6u\ǴYlEb[g|p,h+"IRJ6pDZT3`8yPK͎#C~`UU nyk^ׁYQ^{8ޏQV6>(?qtz\*9ʈwDS/q`:m8~}~ײ:'u|TcPh,!fg9E$H3x6J4 =A`G~S㱶3u˱"5eت30npե{I>X;esS{kk !jwj'v;Ywz<)kxՁkۓePQ룄>O݀^sxAӦztV8_v{o?@l9l]/ל\E$_UfpWIj-˟/>kTvTg$d> y^bnMm3xşvLsdvqj0^~qo3nZyso~OaVߏ&[/m~>-ѭ~z5lsM8Ĉyi~pl=~'˿9{gzW@W}ɺ[R8~߳txm`%붊3OBe@Ʋ{czgvcǴo -@͒swns(n2oЙ뮶[$Lʵֻ{ffGy']NlpݩvsE¹k{zmXum~DYLDyn$~N5(CӍr)^ElF*O)1^B6%EY P.(V4,\_fC9.ϟCsOGOx;& U`9jF1'A&^[3_MM]jn<;eyc#}c(a}gx6p"}z:tTkajͯ DAUV\ N\mu p ,GA?(hzt_֓z~/R>*+d-,3'9;E/S4@' U%PQ 
*}.\>齻X=xo>ݮ4L%3ZM@[3V: :k%7|pQq):(ս[ʍuc+t}2? +~nmdZAIl]KǝfuN|]M, 0'"uR"[UDY-G[z$+?"˕ :gItD܇HФur1-Xl 2#}>/r[hnuZ9.w(DVrßs_;? ͙`XD! ::̄Y;RyLRKNe+K%URu%.IVg$LNj ((@s +g˳=M&{ڣկ޶ܖ~bz][DzD)Ɗ[-^ 즍^X1|}кҨkigf'HoB&doB iϚE4 s10)B[ 9nQL)ƍ옍`If`q]2@WFg}KWFr)+IQ\J)qI7@9=ϻ뗷s7Kx-0?{ȍJlK|? 3InI%r$ypRa,K-[In6Y$_Ug: 0vutx*A\I9Y44 3 {P9  1Y bA[эAo mྒྷWЬ%DAFJH/8xm9R)iI-><\Zp=q<3 me}2u[7EfV܈\ş1[1Q gf aq#=QEÒnx'sXMzS68y>W/z38kTeVJiƷ~B .^qQB4%Y/e Z0EN93XZej^V#[Ϧ0ZV L*PDHoHY7X8|(ZFsL`&x92J#"( -J)~)$m`9]3poڎOp|=$vE| ha AޟiECsڰ ]\qĵLrJXH_;1!y}7'''^HW$*n Mt~;s@, 4T)b+tH"A2rVk/;^sXeSC/FYrUКcyr Y=R۾/x ]"C$&Sux6Y5 r ·ǷMS_ͯU1+$WE]vFm Mtgz|ԷbZȰ}-Gߩk;5'hLm^iNqHkʞ,q-+=3\<8r\cKjW$\_/"r(~[Ҍۚ3O5b4ZNuTh"X2Ԗx#)VJ&y("$n6hϥ&L4E!ZFe)A(Zb">xֲ5r6Wc(%Jz.#?]hA4y= MOzaZ}7nAUnrnϰVq2S%F"Vc‘r!6:ʊ=D14>𠎝$w2)@MtL*"Q* K~Jb,aE ! L➺KJH~ianBPE 3^q2Tmp 3DG:,% qsMJq,5`G8xړ`?:2ARPB&6= :#@k%uR1籧b>yX yV1a%`7MOn$C65wmiԱ&UpmW&"Yv[$ T\qVˆ!a$*Nl?ָT(-эqd{=05Lߔ6/ǵXEE,,V7l3ўc@HFD0t|!c3kۿ?a>}OMVÀP& бtH~=K\s鰸(3bXգԘ&+]i: ٍ g#fӤr':=]< vUq6*M[>\R=cad%0\JH<4)Uœy ~,T?Wo>/_Ngٚkge5Tv@Vx;. 
[Unrecoverable binary content: gzip-compressed data from `var/home/core/zuul-output/logs/kubelet.log.gz` inside a tar archive. The compressed byte stream was rendered as text and carries no recoverable prose; the original log can only be read by extracting and decompressing the archive.]
e[췦JyH+S1ET>ٺY%4ۈ*~.)Q.:ǒ:'v`ha9}O\RVԲ}GI9=ԹFLfM L1ٶH+^ӝKzs˕F%* u'3)-UOg0:RSjw»RQLyf=V-eaXx8'wf.O?'S#kVJ&a&H2‰MY"Rm3Exϥ ci$JeiWyU|+4t3JT|i9Ѱ<1Ԡ8~zN43S?bnײ 5[} uK $Lg+|Ӓ.̓WN{]35 vb枠W-@/W S_/"L̗zqV`,VkB4aG`a 7-гέJ,5/1 iXKw&PpRS\&7߿H*z]Q.Jqab_ofE?H1j+J_BC|Li_aJrhl HWa p09htrU&˄ m%O݃E:jV#%tFV JWl+EQ֬M%—YUa5Xe'>.`Rx.N.z>-A:^-W 38 }C \oƎ"^0a;x9ȄLׯ_7Kr/8~ N:f|mLV$4¢SxIr8w?x,=KRQ!ɚL@ģHxC ŻP1[|.ˉrV!rvh IS!B:=r' sL!j6R~zGk Ӝ}m~oEbxxk()כ 2Wv :6,f)3R3ߪfwP%Qwf:fw<-V "YCn}>JeJ#~8Guf #dL~z6{9ȆI籯>.H4 dKf4a -ۺ뉷7AЁX(O20q9l4P/(MfDYS 㕝o~vqv;ǯWEzIk"GL!92\vF=џ,؎: 7gbu')!2wXDlC޼~~Xz`q|`sx1BdTn7?x;drdţB˿317[BgH˳(rE =nVw>❔j<`Z0QCK',GG^!&Qi3y#n=d{['ƺk*KGG>[,GϣhZ<Ǎ`5`❰vWKU7yiopAA @=l`NcJ!t)JFL^cD/lbiX tQ5IiyaMjEĄ2]vY5K@ί;TsH8$|v:ɍo^tɌ:"a 2ΐQҌrY ~vUWaZBQ $2EqnKfpgT&aWmr*`g|9Vj;H)kzE09+ީ5󕠂u{˞ |Q "CB@7Pм)UXaL.JAeL)P`xY65oV k*fHҁ$+UI'1¥G#a=+-1=o7Owh?Y_Ξ@)}Y!W$vdc.0=drGEj./>#.e'!6fn`(3ᕀfp0ٹ m|&)lKF)!l4Bn\->n+o|u}R-I&Tc0.2e9hh2'2& < HWaA軜r6a T%` #%-#N:tNP"~I!溲}J$0ab0a;Jjrjw?$EK5k%n֬dvQnjGo1zB!`W?:x=}Z}!|'ޮWnQ~l!p6W(WWñC_ Mwi ovl&kx2BZBXҶP'%WtRcrQeK4N}sKpݯH*%C2N,}?k2ˢɭ2DƯn9cp]5UrTvWknTM_!TDȱ+m`{a6;t-_?ƫ[XS7(otD掩&NTqHJ4.!X9촱&1i:iX<8XIQ30"yw|iM{pr3Rq:,S@wnUeB(L1{GXRi[,B;FvAdzPЦv&4T5/Q8"5n<Nu%ELEݚ@L@L+B):.X-s='xV;%9vlG|{tnfv}o'(5#2Al>+RΕ='y57-E!zj^ aiD^!RRbc3`e'rN]m,.[/;ed~ar9c/Fހ{07^ P^dcAH&qG,,(qIʎoL\OV M,(S$&[ilgPIkQ++Ff@|o)L,+¤nݧS 4 Ơ)P)03iNf2 )$Jof ^OM&~ckb7=Ff ~PODQ48BgBjJMc)SWvB-Zژ>1Ļ6g+Q%~95#jFgh=Pq)P۱9(NԡDNI;'~aVl$ak2UX@M NA9;jK⊧;b4*!QN@=[%qIp~p_\fS-WnPL޵m$2l( $x'Fhk#k 9 BMl^,KHaWW]U]]]]mtHܠKo)Pec05HX{Q %C\`-H#og0 2i`Ҝ[d6P|]rP[z5wt?eM/TSͅp;+nSYA0KO1L^[1PQ+%wN''LZ1k߄ W[49,5}q3,k&?d}wT|3!{ {7n h{"@Xt Dy*͕5F\فST('u%?وl"E9@(0#sk zIBIH)E#ȋT"6.mO5BVCQ[sjP%Ʌ(xny6@$h`)JJ]7\yr5u׺vݯ\*jN`CKKis.)`!W\% "x]a_'x$ռ<5C1ogJ* lEOg3p%ӿ@6&ZjƸƼ{03DE[)mDv$E;(nLOa]5U]*-7{ G8c=6 I@Nj4BM8OBw]n?N΃',)bX('N`VJq.0(U9q˥:r% 胬_#N1tvZ ٯКQ+ YJb5jM59)n T8b㤒4(<njA%FKC1BU4-3ׁ\ arn5J-%"!wjFGkۀTsBQu M/Ϋ?yCMGhsz%^ \zv'2!Af҄[Ӓ2Ȝ6"hf񹥔 jAu?ɍTjo%(G9d"aԁ9C9rhb ##N-ambq1}I/_dx{~W~*F7NfX1df p0" :;ԱqY0  
1E,!Ά9<+;Lo#($:hZq*AgSÂW-˜Z=`nŨXv,o̗*Y޸ ZlZekX=Ʌ2(wŔ *>%1,r)wu!2 S3'rq]1f臲OgSC|wx$"rg_O 8<48Z;?ٚĀYY}ӃxDM%Zle=kZqͦ~jV¡ǻC ]QɔN ҌHs>O2 tiDC ~҂x298˩LQ!r>r%1&,r HR!V' Aw9'U%4اpD BYVCU[ܖݪ-U#LO i@3 h)LV48T<( 8( 9Zc!0piG8ç!5QV9BLk=U\D 3RezSs#]z5,,t9LrlRԜ ҟvը; X.`j-R(C I(Q-WDkxlS‚McjpP`(`zсjA*],l4L$Vpbc;!J';uߝҞOG As*#9Hsc=ǑY\x0!U݄J1y-k= tv+-')U.X.gbavwfV.b eō+g(,DgU1n~{!^Wl2xs/ZR)u5C/,6|GL+e؞z +V}0Wd .Ou?!Z{֪TLwaUrV4hbNJP@szҕYgl1giY&V? a6@t)cg2(̥^sϱ\m:=!N.FKF E01 A`(t!䠩VΣ}.Sjqn B8"lrh1pFc4 Fa9,$ RhtJrLYD. /z3c'qCZbP׫RnT1o<ǒ]@Hep&"&a*Y T3l A#gJR$sk$pYY0NIܝ!=.YY+d`@R3OI%'ɸ4QLti._$1V(a(%,ZvEt 02Oۿ,gWDEyGsӈb>/Inx`po;t!Y(1ۏ`$4/e@//d2|AR4֏U`8BΎuֵ6\zVI_ W"wGzjNb"RO(aR Mx(]ͭWQUowgQX:ƌ0Rm3~ ];$n 5Ӓupγ:3c!5ֳcM#7f؋MrS"ְPn uL1.2ưtNQfzx.^ `1lxWI϶8gۗ >u#EJ[ËB_D@o(⛖EZ3Ɨ,%vq̞s|UfQ_xcC3-\ap]#=s77EX3#GRbN:"c+z`K N 2%lOٕ`),/bb:)or .IOp=%IzB⋧H0J=S#=XM}=%6RبPJ*o_&᎔x|K St*a[@ j4 =-fp;U@eh} /!z_m>QN*{)]lSP9Zũ >lgIcL=&D s}4saG;B‡JVb.GNuoabV.quiA(W#`/\w@;it }w >~wQHԌ1e B?dvƼXi_1' L]]d2!gln;!*CMSHi؞%XS@2$-oS#qdI"U/Oϗ&NNY|0Btƚ4 D7spM6h2G{f:V1t؅֣=GRenqQjg5a &'7vD~`?oTHvMJAy:}}Oigna~ξ_ݥ\c]XᾑaxG߈05ÿI4})fU"KWnѢw?ꇗ}U8\/ Mdsc;'-6VbV!N.FKeHQQa*0c@2XzVz19"k_-o޴0BW]]o{(!~}VK X#_wߝc(sg5&G$3(q˵3Oi[#"yzIU6=/\,7_l/H;P+ C$g msMa[SprFFm#!6FKT4n6dFm3SV&'RbxZdrfqʆV0a/*Wa״ 'm[#/?B 1@V+eSq|)lŪgK޻YkF*6 rkb@o;|E:8 ȻX,-nܬJ/|={Wqt^|Unfכ2xsV-[y2^X Ѣ@ *i-(Z#Жb ɕUPx\ri,6qCUFR)r33ae28|&FAʟlB]!==늋d^Sbb|| s.? 
3ɶb"]sFWXp{y?XKI.nSfDHYDcKL{Џ|77&ag(3_|*"-;L}k˴U){g [ca/=RK$w^˂B%'I8헯!Z_PBXK#%dA,/c=9!8W/B1-y&S"x/o  I݇ ^ Y|xwx]ߌգC\Szk&M9.A'JFwک ^EFZT;35y49k٢I.wVʠ1ZہA -h0 4WNֿM@ȗanr$-7x|~r  >>QAQ#r_N[BBixsLB>xΞ簺-O#zX}N2lxpcyDy-'sW-#Ux nY~QR'ǦC10-ڡ {=N+;⻀=T+䌡of3k"H-;G+BI7e{ |#E%ب[@Z`qpcN>XAC2Wjjf{ݜUcke}t z#7ݭwcuJD?WN$HKc|owl~z>vLYi/4E@ dNV5FpI5%>FEw_[a$hQC`֔Z frlB:[%鬪rVXk]6pV`2ÇCB(xYb'I1&žFwV 30Q:@N7:+#7)0a.wEsDwu 4Uzgϑ#J[{r8b[pAiKF % i9.}LxfzJ[w2+4kAUh'EM4ڜ@> ǻزnpQ`gq=l~q=P-GG*QQE> NdgƓ3?̬?sx8]_',̐IM=xZ8j 2y!:#1ܗ|s0p/TQBMD Nq#mR PJؙ(Q'VZ9s~WG-#qx ϽAD)GJN."јp#z!Bn BIz.PLt42RRvF @ϭK(<:&R2ɽDTVn+u>ۿMǓEp_tx'K,!N!?XQ BR%r Vms\]LغwwXÞ](]j[c@Uf(;=PQƞ( W13șE:#(H N.=&($d%IvU/ |P " ,P㦘."9d=2ʲgsӖ ۆ1ʆ iUkks&35[~~Z r[zgv˽S;(N5 (mDz ֭}\,a=얌u߬f(;pdphRwmQ5~{E(w̘X٢f4t~k(wгnhd yjWo ?1-[CB ̼ HkšE>,U.iR$9B } i8k2! 18A@ȞR O]+æKreHo'  5O.!@`(@?,wuE;š,ʑ`\>ٓ! n@UR kp8;vLch~?Sſd-O~aFVZJEG sCk D' c2߷_[.sWX KSwF֓]2:Ne0T?ga#u>|t߸װGw^%}y_.o9R0O`w,]1[ͧlH,/'$ބ-˭q:C gHij)T+͸Jb,' ʹecnen ~d˕}$k;0h{ߠ/&0ݖA3 aP^i4Zn2 #|[ss4pѓ|~>Q6SM#>,cÚ!8@ܩokS4=!F8]uv,7\$*eݐB rlh8ڪDaP#T$%`1Aa`MJTJHq*u r~X/ο@1,H wRġ^&o-Ĵ޷lâ6 cY >y]O[M42؞;Gq.r+LI#ibirpRե 0cq١bglDNFs{+~ V{V\7L/[`V@`7y~8XsG3ܻcO׭^+xl R y[\X5)&*Q4]Jk4ʍj5/NB~|?ȗ!?3|C>Ml_[|I aBD>I*,YP$c ^s = ٶXB\oy3P 4! 'T 9 X̍u &x$yP]<`Sv |6:1ul^>/s @},Z{,ZcĢAw =2.Zf\,_5w$HDR$ҽw\եмR:$7.#h0Xxb*@&uK9E:ls&$5I=1ɘ'iO8`ܼ6WgwG/fNn3ђpl-BGj{njT=ܸ746` >WÍwx!G{kE|эhC&4̊%DKLem^_К~IDC DSOXG~{VVeVmt2L]+ql}(_E>C*plWbTeCIM2 ̀֎~d:-4i<  ם$/-m̓Z4I[K 4/~F5k FܴYJidB31s8b4I&ʠԦJOh"JͤrTz;hX5ǹϰgRSZ (kIX8M-NJSD|JlnQvK1^+ޭa}ͧ|{g&RnI?a%EU3gK@%!}SyL LyL 1A!ԟT S1~?NSv`?$DwjHne9*SqՖ=5SEHCس꫖X+KZd FSBM. 
&0$:ӄcQ ֌pvV-NJ'Jlz;O}pl ve`<]|E9?߸ U*xhաW˳p}dzQyx̡k47 av*|([-7ipVjg5/KI7G(WnE=hLqy_cǺ' `byPDtbQƺL1jϺEZԺ!!_FT.%1ɮu[,N3XƙjͺEZԺ!!_F?رnNn<($:(eNw|hB [$+(xƹkupwABr֭w!M)˃"2﷤n-Т֭ E aяc|1[|$i6yPLY'MsA.&J0to˾[ϳ_7!LgwCuAէ>l'׫.uj}D3$i.:ЂCBSn= )$etz =(귞m=Lpd[Dpx "&´s5G XRpzG8MD )>΃&K g&$k!I(T7B#6!yE;cfW~2Ḃ;]MVRJqUJqF&O^JRT׎_|HIf~H9I!LT":.$M*Jq1&,^E0Y>~nuMa>׵4[|Rp4Z5_`7G`G.Ԇi\J&%yҤd:i}ҤҤb&aB0ݦ6}s)L6aӵ+W9*F9"ވ~.hγ @01 ЏѬ'p, Ӭ6-V@LpĜ uɮ[^tR)`*ubGu2~\pyV{Ju&qexeLÔMÔ^[M[ЋCoO{73W=2/"|@vr ql;+HS1">RU4ZzJ>Fp>:cEH3,#Z 4c_(U({7nC9&cuR$oJv_Ae.8*KiQJy^s1LD*V3@+:͛:1%~ 'D3Ւeh[4`pCDthD4Y̦R*p4{@˒9o 1x y#"DR/#>*:_fF/l&>Q0&Gy& sy-҇Л(yP|VP-ŧ_:aOk'QtQ MDT8”`6Sr4`[˦F}v9EƦ!ޚ(U2  K%>RِfaG| T1xFGqjDN]T5%}tDz"Ĩj%ۉ~6ćſҟ.S5r.^.cF3#^yڈDϔjY^WSAGШ*.0Y"jA\,fz;'$ v?юQ RԜۓ|JR*. PWю?6߾*8YLMOֆ7%iܯiջ?}\c: 𹥪A~I3Hta$TRdM>]%g W˴.*s]~=(&{R~WX,Z+z{k1H)$pHk7m)n40*a5+mZP^ih/-^>d_DkeP=##&?%3$ᱚZƿN?W}XwT\^g+Rʶ; 9(Ώ B4R}XIVQa?[M0PK^,^ YJt`TcW;?ۑf}lW?.(QXҐBS1 !UbijZ2 ΖZ.8\ocJ]'׋!հj#Z ?e&h7SŢ ΋(6~ps$rBNhoX>ΧC?,ӓ*K,JTt e2G[_86" ej2{]Ǵ[ZO+=;o(> JP& H(ZeqۋK0 !$sϋaTV)nhahoƓ*m֎Nδvh4_;$W$Cadal( 2x\ț8/k 6LXt2|0Ppѐ-_pƧӋ@]rsFQ_Z@$];b\WÛvMU5|Nq pUoR8K%InjfX J9> Z)"L]TEaH-iZ*Rib}d ir_S់mwrU#5~ f#[O?qQtTףZuu\Zuu\kuR"Uqyݲ>:&FKP:'"LMZiGۇoid0)a#m lIS馋#*^XgcB^4e6 hh5ɖ.:ٶVDẃo[2.Ym,}:\B:n>XphFQ^I/C/x))wRzBLN[YFIMvPEJ=vQ[yf)ϜS<)jYԔh6zS:K00%G`#vġ\(Vɳ-k'OL_͹+ @eZq(ѵ 9, WD%NG ΢i%-Cģ?D'=a#AHsnmDRm۸h+!ڐڽ*W*s%0d y H*bAhx?QTmEi,jv(zh'ю#Y~:s\-"d8Ek(p6r)P.Fm~7[η&rit[Eu4"0qˉcQL]zm4;-!RZ%c49a$D* n.5`?ә#\@8ܨ7\tr5W# (0/MZ+sؤWl/lFS4:5FukaSՒnǞ-Ng;EsڼO&']V{]V׫RS;MFWY&t;c1)K$sv'/$O[YT|5ǂeY{ NtE M82ts.99 vȝLѰŧuWhpb֬S-x |[ ^l R'ϥ\zy09*ӀTIM:3U{ʬjg?f"c08v>Ky[#cs3s@|7Ob<6_9^Ŀgh*2 ]*˞&tu9<gE}W Z"ʵ~l8Û̹UРip}bgot|&y:c ZSS;5r"YOnr5p[~I>AϧݤQ2{YDڭZnYa+K' Іkjp jB9mt'(α|؀jn<t'q~.%!qZ-tC)MT~{u⯆Ǥ5?rgu'C0z}3#Sϩ>{jK۫Ns~~چ%CdyƜHۨc]i'$3¾>/|~:ņNL4[b]] 4m'@b}kl@'?+cvO6*yVRILՏ7~.7O'qK{^:-sYõ+֝2(!{XKG\p87~^:{^mmRIG*f& \l4M|&th{(s=0kf"HO/Z~A~r0-6f v=ycɫi |N=Orgft: t مixm3 hL!F{Lt0/FCw:2uXlͦIү s-?Ʒ޷|ۼL$Hv2&s>LeF+& @Fy'"M'箾I]/>#I%V37_Z%p{jZ|?"=|@kgW* ϳ@ mŽ&7]^&>SII 
>ZO̦8G^agslTs8㫠Djy=" BeSN"R mLm4%F?w܀|PzT3ԻmLȳ`!CW}=1KA>r6s|󃐍w@Qr %Bt*U$hK%pcE SJu.ìMDȹz$LS8ՠM皗e=ZEzm:%ުЦ3C3ڡ6[1RTMxPq{$EZo%3nQ Vnwo6%a>wͣlIKuZT,Ϊ0 ^(V |!=`S hcxO1 Q&4DZL1R^|ާ;\gZVBvMFvG,gd@R/x0;)eRhD`@9l R01(oD~F$FzydK-U6">>fA$o0-1M*U O_Qqg o0'M7`^ewËdR Q:}D} %ص>.' "Xx *"5\RLLr*@uEx'mTta$b t+] zWE\p,zb=ReojPJwPr ;V@N;O 3:h+ߩ;@TlF+ =p%of9 h0D}iyr#n̼w='|)Xĝf0#]՟0C#FFd8rL>kU\^2%UQw]`EbB1O2ڨKy|Gyj?ffZćP1!:_zZff{V3  xIv~` fS֑ 0 V= zav\kGj` x&[#*#!!"lCmx+&t{O+k`z9X%YIxB!x& Q E\` qLEBoSzx}KIdoC ⤿"G77VʌM)Q4U9Jը$P8wGM+).W稾nƕWZMUAO]n,)j5WQZQo~>+4K%}9h=_H-8]l5si8aBy;۾EBIC!6vg'0+Js \7%Lh䖊aԀ *}6nKɑ]5 Ζa4Ƈ>1]fRqT\/_ LĖӷ hB>z%;LcL}0<־Ŷ92QpĩԛvDs[BBԋ*3O Q})Ca3驥Lk`:ƒ\PS.RAfnNu&﯎>fbw$dhU2|XH!]􌯭zZK shlIz\_$jph@V^e_WphpTbKE'pm'@SVɺlF, X}(y]*FTy*Ql?OMQg'uDv-i55Iџ L?'k-=mY]r/c Ov}ך#_t qnL_`WKvW W"Wl4R׵{\ɭco`.?]_.\Htvͩޗۈj[`/#G6so AoNhUL7Yr{c3iVG&7RUSE5QqRyȄ5אC6wm;dH]pucrI `ZTmOվ~ [1Ԯ=B=]0 ?L\,d88n8hodT2a$ڀq*$IlkTLlHzi/=~,ғD傡20?Uj@SɂGkI0.MGc=]*c{ \xujXnf΂G)/K]swö\O0UX\Gy3d%E;w>XiZ{dTޘ ^0Enٙ@ͭd;ܻxoa[ ]Fc fd1+q]~L_֋p!?=LT3jm% Ŵ3mx\s{B:] <9 ͔P(gʳH>3}3n%;(.ݑ2tQ9F7o+X_PslTykEcv?\dŔtwbVw{Bq|plĀ\ת饻( jӯw$4@M˝:if1iɁ a<{yjA u-6`ߥu3~BVa$W4Y꽙.4 fz?f/ӭؘYBwdA]X>{Se~&v(t}JULg{L4__f _=*9+t w+ʬDK7W:Q]Z9w;Ћх`s+p^}h) 'x*Ԅ˃P,$hD/ ;uuc/݃("4Fk%e SҺ&T< o'`} 0<ƚ u]-Exm5bHb`2`*AQb&bIT! FM26xYEBCbhcP#bcp"$ ϭQJğ>V+&د`KG;aJi-X d<5~j;~5@z= t(r-Bh@)XD D1 B6*B3EnL 7Sο%A9;" Rr"8븦tƔcj9Olf O9>۟ȃ-ϾQ8Ѿ[3Ż[52½=^4{֒Yܰv`ܞJ[V_×*> (k@LvV@–QUh7:!ZZ2^p:>``aXCJ0F 5B#A:HV#WE>6]ޖexp)BwLb#Oq'V#M #{ b(DHRⒼ:EaAǚ}yv0*Uv6/,¿N|WnnJ.akeh 7! A`i}FZ؍NS /a[ĜVQ*NfX%B&Q. 
螖&з)Wg;ƄOj2'Gd7RRM iبILm%4 T(u ,xI\SM"!e% U ^h"tHaQ YcPA- kKd{jBha~@S5){Jv1S1k(Sc(24M3 Z qQ8TS Ss)Fy!>]k Ih e @Y7O2$Ū$0Y+PDHd(4:,6$f"BӞ0E8j)f Sd񹉶pLr~֗O<XΦKFI |9aZj`.L, bgQQɚ4H&2IQ,$4Ȇ`E,9l8 MXTh2RJ7tiV8N brl.d(9c%7c88u<'QZ@A,18,IL߳ޤ,Ne7ݩViN֗$m'i_ZERv6+`PJ Qm.)e;YQ֒Ĥ`+ O!LDۑ /ѓܣ^aEwmqH4ζŀ'X` \^&x:%Me'oe7O$1b>}"Y,VKZH&SaD* Esk7}*'`*Q.p^XˆU[Ϭr(L;1sjƢ{soEof*fUV%x/}خ "ZՔ^_8P,Q1jDjgwmEǕŴpG):=`^qG>gh̗|*cK&-mIqQ+hL9#X`0 L Lc``tk$ҋ"5F=ϣ>]/}ss7vT10\bl+ he0A;oL6HtWm` .*ffMQTalrZsʠP NN<欑ܡ!Q eɐPJy¬wIK@(de՗PA H?hI0l`l[ŀ5-%ˇ뽇89C3i򯏲YPKtZ;sQ= j^ 9ѽA9i; yF楐 i-fA aʴ;hsW6y)9n l1Zxcœekam\,\ qu9{{5XQA{ z[`~p˚j1iOp Rư#J])[mO_]}ؒ$]o[tof"pk//]}䇋۷5dfsq -KDjf;\xla f|b*le5'[+5[wki1/ncV:k%R?6u>¬tާQiQ"*ǵ3zB;Uލn è1z;U;1 ̲]}"ߍAՕt1 lhKvz%W8K0![⮥u]w^Dm,N;ԠRG*T#P=%P8 ?FdT!{%2W=j^]HC >\b0SFz;7y'Q]53B-5>2(|rryiռ fqȢ*u(2^ͺA{C:D#̆. ȱaV]Bn4-yR PZ^{ 2u%8\o/u Uxg%NR?z)8@{Cdyr4c`XwVUYߍO#FR֍S~Ό n@.=m&1V<:AޙBk0?$6 w~Ȯ*#bn(LV^6{O.th`<|fLr/no^D3v仰ϞatJo=Y9Wqj/Jg{aa)wxM(+yqq[ps{:-h&C1 $ s5#tN\DŽx"(ZJ/K< m.~Q'!Vm;9OhAxkk_> m>rM!1r +Z)[s3SKfi|}+)pF&+Ӫu4F[-W{z]]5 BA"_u8+jAɆ^jV|h~S] t׷g}rH[!IFͼ[E)dQO\^ g3/MMإ1Z<5+!EV RGJ"R"+Cyx;Y.E(189^]Jt%xQҎҕD*6AUsJy*G.(Ƌ ZOڻZ ֥rCx|P9Tk̲N; bG~EإV#)l;Eإ+98%mN!(isQܒg4YI2"5|6isT89N4jcI0BV6:j[$qgv) m)D!W;BJj *3={DF#p+u"L/*v'tu-M`2Olh&9 X,eCT77ff* ך*uXA+ոbhyKEw6'.YLw?\^-V1]/}ssW*]pX=+a%nӾ_;)l励пM,?]\NQMv~4#pzs1sjo@??Y2gL%3}V5*NGiψ `C1",7FZy9|z>Kq9)}ɛeB raq;OJm,,Cg|K;x*S$ԖX.fn9Gng~Ndnr&- yTBCPN rymcB$e)v\YPQcC9%h T],͛SYpb(?H0xx)%*B[)=qRN%WxʈƀES΃ X&qF֡*d<GfV0 @X2B k DpSegvq5u7sZ!WRϕR,b8#F$;-~7v&M:=8^6kwrl~ߖvDmJ~1/Z7N@0-qĦ&C:՛}^Z*gttt1ǎL>ZCŸ/YDռ1SD:֯Ѫ%ۉk€QIjFҁO1Y맰OEBUݜ>٥FH&u%VA4E3M8EUZtD{Qg1GO@5(EB:ޅ(-+)_ڭ[;&7967DkF1 D FE*1ʥ09}y1Wby> Bp^ ?gS,@\0S2F@caZϳi6VfӜM74.G~͢haB9y,re"%ēU+9_h5+P|cAUo KǀS+3`@u|Zv4 Ŕx؞'& rX/]DMkqa9Ϩ([^rXM bvUuuƹzVh@O>3Ɉ}nW QZɝ}?K,dϪz9HH8L"N)U=`K#DhO1»p]+)},jԠa]! 
XeYs (^ &ʄCD\qwqX" DIתNآ0`e(ad ),0L ,JEF",Fƶc01e#YYh~*NGFE$oe;]3=jmtWkIܘJNZz@_ 累\~AH@%|-EfZ*q0`pD2 LL(Nu0|R96yVv'8e,S>LR!.C GC$H"%ңzP\OK3f4Q24`@!*FsP#D+qӜЄY)yK0EZNmɑ81mc4- 8c"ZÞ 0d24BLITN+m$9blр cmOc?Jh{vEJ*E1u-n $EddDdDdIsܻ> $'ޖ7_rRG~^' |q2VBV@?~K/!C&wȺ{"u7W8/N'uI)5/Gw7gTD$*FSXqspr̹L'to8:F\ mpauCi WaiX_3 1U/%ZGr^'2T:^zfKiK- hԿsf㨅^[Qbri"4/j;ҡ3Y1ᢒ600'C}J3Gѣ@<.z'MDr8 saaB$xDᾩhJTb'E# SYS`rةtQ XsX<)K1 ,G;#% }l!Fm+m85Z1Iwp$f'փΕ?12`pGcm 1շٷ.vQtǤ5jt9m]1C#@ֻz#]`EtqچH-m-5V^{pkQ N`oBA3IuFyR,bhL4+ (paԋ6({gXDod" XA&rJyb #*b*0MѩQ݁I ET ?!h~S' cĜ14tLлT^5IG H R3 5 ldĶfLARJއD }K%]'yx , {^Xk+$3<Fqxj y [\J εHM IB)K#uc2) dWp1\#Z>E=OI}s$B9|f +}'0mŠq_`:Nn o_,Ⳍ?57WUԷ|lF)p[S&7LyVˇ|9r9FNI3]~Ott'eqP~)vHT,viz"=ûNTN/_\=P#ĚLi$#Q"]z)];b5]pUə!XË{[qUaTL*DM؈U [Qf)olgWkv2h3MSg TɈH1SQh,2l'q~G>ܮ^@eIJ6 rjQYl,]Kk/c2ZɐsXJ9x^4^NlJviMem͘)>}$U)8%N#=tpAm+{[eGu;sy,߮-ҩ(E!o<&Ttuv gsd`U֫ XR ={~*) '_CŐsHi:"(I\V:[ CF[Lm|d=V f6#y(j&tZTZT :^/1C8u͐K{>a"EZbe eL}%چERvB8z̾Xӝ}3 ߹-CfxĘK`+ͱT91ԑRq7Fxt8tVqg,wΛm"RΌac  ,6xbN=4`Y f%ƽsh*ND`P@~>(UVNfapa%ZQ&D3YqLͅP]a쨊SXO+9hW{q)h,vCw8"$G,4i}Ҵ LE {tSWOS[[Zz-]˗e6`ej֥GvF +R \:|t#E7Nv%ϗ7/mNMJ3ew݂FIswew>q&Jσծ06y fd}gib+C(J1}#N qcF=iViZOS9N:rDCМ*ǿ[nyCs\Km,n_ }XxYCDo=^RmA<[wџaޡѧvr~*WGωV8༫U:ӹF.=#+gt^}>v 1[[lRzQzK24eFf['A?prc) WRSU%ak-)j-i A\xĖznv!ùeBXǙ3EK`g0p)OpJ!;'S ^_%\K5&g:-zJZ-WQFTQ*Ѐ))67W~H.D7ΖLvƅl5M-+;/1H;Yֶ)A;֔ KYXX$bZZ4A=Q'ŴK!,-W`5#:]]Qjw-=kQJ[mXVXN+HBEtRTjK~lݚǿ| 1H+{>EOIu0?'s(&W3N ,zqÙ+W̪ۗ \zz/8邮7rM3Ŕ$ڹ걐܃) !z6Ffqqm Ai5Q1 [d".o@{J^H#CQR&b/\2X'r7syGjjc!MnNS!iuka-:-Q[Z8ʱډ9L1lZNn QK_iá[Bvyg [BXljb %5K!"xfba$q OLhŕ1X ln <5`cD)8 ϨZ,ti4AK' dKדԜ#S>87fn1/s S*)}~ܣVgu0zB$c Oy+$ plxC2~?j,3cL;P=V嬆l(g{ny+$8ߪSpF֑=*<{"H=RR'hon \\O1ny.O5njέ[^Ik#Tw^4biR0E-8@dx&B|>}?&bF$]avbQ⋾݋a4 "HY}L(#lVN …`ЁkEFb_f6ffUDKyfmdI%lVe%ռF 1(M]D V*e)*)Qpk-B6 NvfjATx.4{6YqevLY+ Ɗ9v@VJz` k\AЖ_@ w PH(O} '{08'A•D^-1\GWSYڜ ޫ^.A((Q t!;lY5SUZh=֧| Neў.97 !m]RDU hLLXb _n ^sWRgJ {'UCalJV?nK,E]Fd]y{ A鱜#%Ze%W46sR6X+ԝO53:2??,C3#;/n+uK ǤwN~!=Atck3E#| .Gd&DWzŗgP2@#uY/_rp6 0CSdԘL,AKX;!kp[h4! 
݅3jC6'7{od}PZ3=B=0y0KZCsjHqDZ'BuLt{E |6{GT %(1@{ kuڪZpl->[< 1V`B8,|`8^O ouZ[aSDӱo14UJfQC&=Q=&]6QB5Sm :雛&% R];c!KOuqP# )4‚ GfR%IB~_E*~?/195 f;rX(b'`⮣&:ݿ'3!X[f5!p&Fa˂sx.Rx!pNDu؟_*^1^_]U:a0˻y̰.?/_y*-FٙsGD:I;#]iE18(rPLX4*$UmHe.K Ln 3x@`{{a(, `#O4D2ǜ'G)[[Ai ?ӽѵ߀X$4N 8Dd<90d~xmt :-AcնߑA;C+7=sW|>u=ڧ{Z{N9ۯ{*vI^FDa(x(P,ߣ{B( ̛՚*D=*^ED#[i w+*gY)?d{Df+̉5 3du{h]:0[]èMJg=C*ED0lC氥>3Ĉ5aN% JgPcz,Sx`yc\Zq'n;"3.z(6Y,*j4~#q3ﯗ6lo/oAi)ud:2tNJ7EsrC,j?irБi(nW[,M,:n>0Ufr=uT94{gJ8qavN`}Xͼ)qgNdcgVb.׻uծ^As9(;PRbdMW(*Ir[OJ;QR0ESʹA龂N]>@gN҆lU2;D_) ATK$ ~=C56yj\,`*gԒ %C{ ߦ%#|ukeeUקE s#NNNY ) i !RprBq̾'^7y>\A[X0z~lb!wG 0_,n(|~1'ȨA`QDpSX)5clS#gL<[XMH_BizŚzlb-ZCZ9$|{w[\N:\P т༠ C_"gm)h_8_26Iת#V jeP L$1:G,!KAia℁ʈLJV4m@ frĤnoz.x.gŐ GA rĉXAO!0xq#Z*p|<oh^Yr[lʎ.Q=ś_f~dkъ3_,>N>}[VšʷUuO^VfЕ者z?}@iǿo,bA PNBl5lMW k=Rgd}ޘ}(WkP)Tv Ŀk$uv~ԘtfLySdnM/s{Fka-)ݨkB%P ;P,Q[ *̛~0́:Kjp:!8b89T;GM~wP8NƏTNrXNUm:=AӰ4h!C$E{${Oo "| >^~i$s ;xK:5:;xQd㥖#5$;xaKiHnR34sr k;Mp =r2ۻݭ_.|^nmˋ:q=|}&2.W٫Ųɲb*Qdڏw3X+Zv#7uN u 1!3+_fet_;f㠒Zk{s<bǫffez`'g3/n&JW,٬옉ьw೰ɧeĩEQP:/ѩ#3Ib8T:Ͼg{'xpdw6{:<[Q:\0ߺo3.ƱzdUUCӂ#˺ p9ahcrk>GWBc#~ᢸN|w69& 7ײS1{R=BOŸ?Okm4wn-l!>T0^B)a / 6 /uڬx809ɜZ=yz`ꏝkQs0Oy!HˑsN-V9 }I W:[ç=+G.ճ[NqW'rNx$ύE(Xhj\Z8)-rVM3cO /5j[_ 1 ؗqlPΈgg"@ċ˞ċSd%dVWNձSdDrү8ߵz|VqnٸSDaj?U$ d4Jl6f6OMh`M/5ڮt'+uQy/gx=:eYǰjq1SgM{ԝ_0`lz^Bco?𫷗y`0 F @!D6!E\xPh} y6=!hpLe%A;DvŐ;H|ﲁ]BnGc]bSn(\ SXh~3X}NC BS-NaP)S¡y{O;; mO%_Eר-ʰrvyzA{x$rI]$ɻބ8 ;fPJdpcb!6s% -vHhY:o cθ՛1̓*IBbD u+ޕq$B˛> CR8X@"<AV4"u 53&Āx x誮jx`*bip&I5J @@JYA`T./%0K^Nza.iZ?{\ܥk&wBvnOWvjOvkC^8)YQ+{s^{koRAosst&d{Ԃɜ7Q:ߕe=9Ɯc Hs1f)9Ɯc̍Js1f9C9Ɯc̍J݋1s i1s _1f5'9Ɯc̍J.euv% c1&%):cD c9ܨ$3QXs17* Ř)8ǘsIIŘ)9ƜcJR݋1SI1s ^j1F%$v/̨ 9Ɯc̍Jղ{1f&1cnTc%)c̜|GL17* hܽ38c1f%S޽3s17* Ř<ǘsQIJw/,9ƜcMJ Rv/,9Ɯc̍J£oU}P9Ɯc KV{1f1cnN8TCk`5:;?@Df>nX0"6& Tibd6/q1i40}a<"dkxprٹZXQRAl$5 QY˷IDu<}\1 ά^QJ~j00|NaQ.̴ mfr1t L XڔKxXLg{JvltY]t <8tfΫ:L JK%EUV]Q:qɧ0N/W.>x{s!(3T5:3_,y4hSZl?F 0b #W egcg lfI o|Kr%= -x೑0Mju_t:B;jLR%VlZ2 6AojD(WXm7 ؙӫlc!5lڭL%nz{m9SÜ~l-5~7쒉`h}/7e%i+9lV0ߔ}?ΧmU}t ,CmŰ`BWƻ T|L{IqZ $H5zn6e)C-n-l'\en5wn`붻LAT{iAtf 
,iWA!5n[pܖ4,2ZspWi-%0\66*|/脫6lyR سvI[g[n bH!'WskƇq,H N+`̻Bs- %!,r\H'^v2uQ34}@l*qg*2Ff3plV=P ;hOyT` 5 }/^̇HWsVو 1;"78x `J" kBXZ)0!+.6L/"V1@A`'5PBɻad `@9RQ崧H6Q .XMKN; !FEP#ISA XO$YVp6ôA3A)Y q$4v{ADihuZQ !c=)xt8Qׂ z&VV ڑa:K;%xpӒ|ZY0-F`^XOdD( -ENZ"Y MKCUP+8_jj1 X`./dR"rT!e|[y@ oX $h@ c/`i8a 5%,Jb1e % Ya`0փmϪŻ9A}=OhGi򝝅1\?UX1X]6s(C"]N{19>_z`l @eܜ "@(R8/oLg`boFf6\f쟋OelLqMuo*rW0n1#$R-ptͳ}ŝ($~#U$S|'`"`GHhPfaDn,u;;xsԌl\ɖjjimd3zsx,᫙~/muxeg՛&tڻSbcYo*{ޒ7,M;*-7~V ҀvPT$hECQYqr]%v*SUn&6HatK3mA# *8I[GT1t\PƔlb_쟃FG/n_HΙ:?G FpE2z NZlX 483B͘NSNj GB(uzޛyEW)Ma8r'? Hp񩋟/ư5CT>|v\2J*~ at2|7_LwP(Cg|遰:DiHY("XiŪtPrӪt~knڳ2 ,#hFa^6E\HܭϑOV}d$} ad_%u2`D -*%wT { ͚tOԓHRQy2r[?uL+8 ,HbNY.6.$ I$ 'ea(:^+1^3%'Z;-V؈dx^X h8G(ʧvTY\7S?F ߌTѢҚ4!4B0Z=G) RHDn|L[rv|]9 󢍜Dhrm.$nHWg슡n.쩘+$QǜB9M qKMDZ EfA{m'$/jclst ?yƐo5'N0#= 4U,Ӄq"Jk f ŋaژo2hqVne Ztw/(1[7?:ȩE-ȥn w#wMg*E?ϩ>70he|^g 0]u<I֔ZcZq_!ZofWz?ՇwLC-DR)m"5O>^0E]wpOH [\wθ_Գմxνĭ{F$ˮ0bȫZBS=")f5t V7Ȉ82.6΃flcHbq|½|RjZ/G+ A>篎9(/)tKiK *lPl BcsD"h 8UC5@Jge[a"zvJ,UˠD ;l0" (?)7{0Qz>۱Fyƈ:KVRƙ\\*-6[DsJp ,ZYKNNnVm^x$(]&''2OQ)m'I"H0E$%acW`˩E"xj,udZ&hmH#MV, 4%nCXoi^ͤ(=(Rf܌էױ4Ѹ'}857cF'(Lt>aiNG =h-FH$ R؇ݚPy#@s%\S ;eT lhQPMJ`0he6\4=E) zCcٝjBiQ׀Z;#bEP'ԛhআ].,yEQKi?]LTw7 ̄.XǃbqRfSP2k)88oKYv_s+wHX`Lx|S RɔC*S:eG8KKՆE )䨳UfNIJPWL!yB̘:0X0W,$g?*3fmQ\c6PpRɇ\yNN%cfI%R*l/ 9c,Tδ(*(b qeC A-P)៥=jܺQ\x$j\!]:9&r|g .:JG` ˁa]˩F➐bRa6kMR0ca)n2k&+IfU9NxĹjl?E&dD\8i=i7kM3@|{Q#X SNIo?É+<,q<< f˧'E-O71&rQR]0pttOOAGV^lE',噝/-,ey zWa[ZrvA@MT-ʾEd$( Zd6`Y0KTk` r5 (3<i*H_prG4GѬ+87)٩rsfV0Z\ Q &G D-C[_c?-@w:ŝgUJ3[bq.9.L<.ߑ?-tEV=q̟f3%u ŇOm8FQ01bL '콶EA?!`z>?i H?3O2cX<у!SԗC`1d;rǽG6 l%߱؍D{?h+@"}J}_Yj١2>~\ 167֨DHG+y-",ݮ,F e^3,3hpTNO,E4Yz*J)+'ݣw"9JVz'r>/=uk+{BeNX2)mQ:]JCD${h$1{,=4tx<わ l KU&%畣'`AZPl$6מQې_\VF@S{H9D9;MtΩH{H$=pS3ׂ ¼T݃Y.`9DRQob rڭ}Sg#p(Pm6a}[pOAsbGp6wYU"%YEA%Hr: +L-j+kbcJ$(铙OT_n_ԩF`ǁ>,[.ワz]n"u`W*g{g?s%S,gSB^j$wtVmx#p& *7OQM 0 M0;~yf "?5x4l+M|r=q|OL}(٠*Nxy?wx7Jj5^I?7o{~&B/q.<ϋx`6H.sEjȚGw; `/[)?,Tgsά dV&zЧ/%ڕSĚ2sBDkXBK* 佛8'q`s!Ѷyfbf*U9(܇j-(o3k\Ty@A0` /j~|5; q|a'r'Y eufPKc Μg3`B 
9aT&P;[K&EB{`)"s5\mbgW,4AهxnLWED/9<4Bv8dcxqUxsC1-ǐ{yFqJ԰J *?UFH}x#ٰ?>K$Y|m7$)LBJ.$ꝣO('kmE{ 5]_C1 -᷈jrh Th( u"rE)OB,f[˰(dwE:?^x yl|ܒV(L=j'ܓpXOk<4lqؒ>uAy櫇zp#irw=A8ߍY_NݐZm)ۻu+dcIdH*~34bfH̃#QFw辿|ma]8Qaz+>4d `kufm{q~sp&gzxf/uO3o;v9Ȑږq~-N{8ߍ$D*ülʮ' S\]gloᣟi<' (za$<*Sp:S: dk6OwnwO93ȕf0XM=e8=|2xmaYY 9"#Q~·Odo +>vsքrBy?IDzLIsu6ʺ9!vo!jvu6#t$S2BF?ޔpR|:F9@Ы|G|q>1ְCck7g6CT*EMI07 U4UOJ޻E2! Eh”hIۃvXҚg)E7ƗS@Z |9i85Pɰ'Pg`爖L{ͷ?6>S9S>$ubTJҼpD׷0Mfӛ.YjخRԍD=E`7oatgfbz`m˱w0n5dGQmY{I9<#~>&IЧg6}U*`2jpOo? PZ&{CY j𶸤3swOw;J,~|/`꼒>UQ@ S)jPfk&ҵl`K_ `a0C\?DÛ`<_sbO]̿z&KF'_TQP:|]n4wqGs@JX>0сAJVk";+f1}+ % ̈np{Ƹ@ |W `ׁeKVvII:j{QŧY:m^ng(BIÝQ.c9o =EUix'WpEƿD$bP un$*Pňl@(,QX FPm#h!qJ|&s Գ:,8+Kn:PR7#CĥH6r\4azFF O2)2k`M[#r%3a(s:6+pQ  ^mrٯ_9+<#svKuځ\ Ltg.1NJptvej̔ )0Pp<8s[s"Ox?6"40@)Tnu8ug+~*iDaݍc76GSyN#e/POʎ l?[ _炶vWE:#a_ Á5DCs*G֑!E+3W\K=wR{ \((r6;&9dXgX9bˍ˽v"wJaW B{&İ)a",ۧ7벀/'$pJ a6I`YLfX(πbK)FzeBE8Sאqܖ1cwHY$CXTCC4×k4oL Nr^z԰hM-2OyN&| W:Wɿ{9?s P,XRxO)Aǯ. 6)5.J?[mc4rɔAl=Hef=u P9' $/ˌaCnI{ 2J* A:a: չOJ ֆʦ˒˧{C8)=d(4-1ŏK޾Yϋcx *_;ӺrBD T]z݇;/p{(o}gyb]nϊ!W ([3_zj>2NLI:5DTĊ~ow@Ԯ$zm4(W #f&JEʜ*)M+yn?jJ̷2G~bHϕa1[ϓxHY1 y ^V%, (*kBv1m+$N)$Wt|vz߂QZR-C70u%l ꩨа["m6OZcʟN7f^uUM&Af,-bMjݚ?:{3|(Ӌ()gu7R7+ծ7WNai$;fʽ M`Nw7DQD6R%Y@]rpFbX~u /[Hƙͧ[Ou P{  ??uQu[t/eS;\nA)kJj^6swVhq0Z.fB vuikB9kHAFLRP>ڥY7&r*s_)U?SU/"֟x?V-~>̃:-$td+*|ŀ:|'˞rIF@LV;&wjo,'Ǥw>^X'ipd+h"G^|*3񥒌C=h2`VJ<4[YC ','HfԂrd30dP4at_WjVSKvM}mmX$1{nYZ)law\TtKM ti,r&Zb?AQZ*>8&P.rԯ ; chB^+.o'+r9pt[)3-N[٥X GLTQXFi2JJg2QA0.7 Ss`Cp ~Lhŕ Ɨ!A66u=5Y\wWhQRX#܆ 26j[C(J%uX+|&r-fdʉ1s4g(7;( tRsչy(AirRL1_/E׈](Dl+ɂ&Ad;+Xc Qc5#FQFЗ_'Аh[I !ۯ LUn)/ow,7b w/u.IC %-`Dpf4@ޖ$Rx_GCKCt"HeIf`K/vMI\v;)Ryat3Uz`O_ghQE-N cbY87o~|OD&XK"t}ж3Β\3w-BfuEӣļwֳݶi}rn'j1lֲ.^fpZk,Tqv*X3SCQ?b>Cu"TUnxث|swO^5->dթq?s¾K4:u$T:_hJgGHc|Ggch轩w-<+Suuv'=<]z(˓$ $mv)?vKA褎QGLaiڭ E4I<{s[IA FtRǨNC-Xք|"!S 5ںI9*cBω No 0Mfӛ1Z> G5fqsݿ̮nj9U'DE⯖coJPd7T1|w#W<܏*duPhW[B$Pq\[ΉK%c԰@a'a$$Hhw4HǕ7mONԘkO!HբۘחVYħ Ǒ#h[K,f`#L xl -Xe@ec%1fJY ҔwKQHL}H,GD7-%C!2}Y&PV˚8HG-..6ke_T4+E?ơHZK0L ztI2I1Aޤ2aZ"ΦJ4GA4GA4GA4GUѬ&{J88JKNa3ə 
mBLPd,2v@IESߐ)rH1E6r!Fl۪Lc53jk|i<3ayfD[J)w(Fgn3b7#I/^/Zy x"OmI7IKZC}#jA>"+#zb4`HabMt-.2aTzaRהA +c4DQSZK%3; 0 Ҹ7xJxZ+h_Gv(F&u}mpG3R44{bYJ'Fm"k3H1 Ժ'Eh@'5{jSxml{6Wٍ&:WQcULl={j{ֆUcíWZG]46lo ƻkp91xWYrs]֥u<CK\uGn!nX#h=ٴ C-eBCL+.N(oNB\lشgc9/.9[:%J)9,JѦSk҅|"Z"Su'Yn}nN3hY{nuuڭ EH4{sQͯwJ1gn;"|{]@ˏBdj.$ybnmO^^>y9#2us.+Ϭh@,mo8W?>ŐS-QJ 6*{K^;*maI,قjWlyYKp l>,idwRV.lɒFi뤃)h؞k51txr( X MNIe[[}xf[Y7oTҡ!ܼ3^77P9!eq7֤Bă12m*ʩ?}u:߅-PRD,kҗM,Kq&i2c hQa:?VGl"8QNQ@3fH҆#[W j?Es l*2ĄGf8`G5ϘUg,xjQ5'PlkZkQ\jTikiXĕQWQB6jnE~{+9#>ϕBm{xDɰt?ٌi!{Fpym|w, Rcu[g`3zf1&M>g[c 0`6͞y͞y37;j-Aua'qQr\h|iEDF gΉџk/k~R'Ӌux׮$anzEX4d 8_E.rPFnA˜azLҼ](ɍo+F0[ ɁsJclM$8.x1ha\waT}5X:Spj' o+ꚷyk '6qn<Mr-!e"(eX{H-n4e2nTA*`B1^s-A4`(AvD'Fp 9/#PVZθ#S I0! Ua$#E$B1 ATG]1˩BjTKI%LS$p9g61(`&"̔'̞A,H`FuoPBok9O]"GyCik%OL V5|OfP9'3rvi?v lrsutـEf+ݔ//s {o;";S~UeL15R-NbS|qyL\Md%?#ڛ-0u~Rm-E5l8 em=A9{־QL(]ř#N56Z[@9q{% fbFae0CX)Szb-8uN #$3Iy`BF#f,3ks3pxt9G)#BxX8 !ErSIBMFh՜s9Qk^?HϐV^ ]5űye7环Ntr>^|ӖR", &ť-ާRjwބ-ʿ4 *#r!j%,M1syCe<mOkDN45KE7qi|%`pn*gnDK{%]1K1d +kو7M6zoD"CS-8GX*q֪YiX/JFLGl@B 0K V" "㍏KQd$UrL53!I 1;=|y1E/,7$(d׀ZFuLQc3{B qvr ^Λh`9ZNjdRInG u((Ljey5f:z"N"Yf8.`h%q#6Z|${`*?o~_,sQX&l C*3BHc&#FRD HaMceKַ iM)=D'*΁r~)g5`,xPر3kb! %3˭v(b pؚnM-. BR.Gp9+!MDpeF^e`e+#5XKhS L6]9,4|mjIʏY:5?OӤj(3cbWVV2:ڊ|=W((9 fhuD *x75!,z (=Ft#cppd2ڏCosy\(( xӺNt0*ܣr=u~c}R{sw/o8*Ѻ~m*v߫b߾Z~ tO~Zvv+3zY󰻹?odN53?+°^-\2 g\=O^ Wjb ?Cvqtnџz_SV`}=ϝN@Ij6%Nӽ%-/'}V:{T7%'ɨ%nEiŸ'nA-I#g05_l*)5ϘO"-i&q:W2#6ERv@țNxm\%}JS'@gq'QwBxvE[qatS5Yg2`c~tŗ -5-}Wr5b$#WH~{Yq1xKf5~t.?{ ID0b~1%rRrĀGY c>Hb+bTq21E<E9Ȑ唜DR%-|bMHpLu9\?'<6jF)1vRU\)h\@ ;xsO3~u9~U^-*sjt_uPtK>+D_3w-mIh]M'Jds\xJ%J&aHQ CGt7ݩ׾ 4n<]Z-栭3 ('ycNT12w\]u֯({Zݭd1v(:r^gż^o7~1 '^wXj iYQni.B`tD "Kr +H AHy?yle<V$I)ItEx6@M$H) qu$]b` +XITN`w| @TN)Y6B.g`dn0w $ux>!s2{Wb<ИƼsƜo8ięxhT4  kɘThʀ!1Dq$ޅ+ӁQmDHp4xW ݪ㟀zc+8͍>x̘7 h<7ց%Vˏs'( Iںr$ޣę!xWƥT@g$+~p>ϷVj/>l:iƥψm&J13Sqv 2` AF!/ZƬ; +BV`79ݜM x>Lx|d'x=ϭX_;6Q& xq< 9bx$OʘԋSRс*?>p㋧Gv0_}\<䙤Kǭ7=/ΕO')JF"jhH.! 
۪Ojm8Ȑ4J46ZZ)16`D-fR2.jWPԮt'^aPۻ6KX"RtP Y4Zb VQK;!-ԍV*cxWO)%|@,`t:}< '*M:V[][Co,N:O2R kg=J H4Wgvc0hu1Z&PDAvxTt5I,,G@2!dT? *ӊKJcOun,79j;j$HN 6asR%cƌTyLǛurrC|OP2푳MStC=,z_#ӣm רD)]2UGZ "6Ճwkmԑ93Izy_+W%LNFV/u,I5wڒ3G0"0ڙI_yRfд;yn%$ILɪ%n o o>-k^B@6}+s j(bH4Z[ࠑTP 6GQEmpuNǍ.䢲8;zUl@$ELIk2}@ EU)iZDSܗ/|3eTk]Wlq#!@uF S͑5fqoYœMļ,H˵u&G5g,_ &8!j m (,co6kl+i8g'D*)ŕTt[D4TKƢ? 3) tVx֣)\ZNqӓ㓵ۊ"335_(Yo4mm1j Y㇈[Y|ǻO=Yw,V`;h4QQl-o2`CܺZYƼpLgT; 2KG=+XmJF  QTrQ7&2\Jv,ޔT;s{)' `ET`jTuđv1 8MQ8z]#,[~>hSm6/)DŽ\/ Jm {HkOp'UE#ߨ<V \I; K_CXƂ` ^63 XEtsdx{DANwf0w6u$0 "a*7FI(xS OY5Z\P 7ZC :qGr"RTunF-VS XՆ^⯊NԖ>U>E^Y-q9IL5 ,iՌ=$NZYIrv,W)[ĊȗT).؁a!n:?a29A2:"Ő`,08tڛ5vݣזk1Qħ<UƼTW<%>c1sPNZ&< F5FaN nqh*-h$b6V[o_t$i^ny+2 3bB/kR{_'SrcAays]$#-Zm8Phcr+q\GoڟNAaOaGki#_ѽ+:ڬ6Lt3 ̍f"+(}OmhCݐ3R3,p8>Fe2ƕ*H{GxHڍU9lo6z FFgL"ϬPfdwq%⒫3!6V4DKؓl9?mzȹL 7T %jB5"E*?hfɈe< 9#5{?!HFLhjw$ɵ:.I[_Jt. "$IB7 c* o `q*G,Qh1& T"%gNƸ8KtsXu.m cYj]r 'Z[KÈ|AXrū[T| gd4X UTgD,Hp}K.|OhbjZZ 4$dYZ395!?l?c-Kð7# lihv5')\GL-u@XHԙLu&ʄiB l,2ms޸nA'<-PZLo')&ޑ.q&",31pԳ?R,z|NƘ|xonjjjҤ)v/CՅꉪWNG%au40u{.$:OƤ2km6!Y2[YPTM2]%? Un .\z3zrOu}APhKQ($ Mo a`) 3W4T5 9 OEU!΂uo-^brO4b[E@J, 3>q-KEoB*%Sq$X݉U D)Q,WZs`\K꽞 b[+ϴI%TZ[3]q %ɬGU(Ir~0,ȴZg|O! 
cnrm -%gV8 Qr3ٽ8 mʽ4h۳a̞ N w~[Bg6ucIM &mCTK7T:!Nm7`IԦ)Dt: 'e^ևDNsm=F!t+oUɰKo|4/Ҙ~( zk ' Kɹ"0GYG:YW%CE ,1#')$25@iΉ>hPBaQuJQu"%dU''n%a"D򳴈9Ti%ogJ`I%R?ng{2Np=gpf|ko6kS%%%Zׯ7fdfelл^SUtƢ'bFgbR .[x!*D%DX0^ iS%^G: ½^Ϫpq]p,Rq  1pH$Y[--^aAQTjl y"0U6hX bQh"kF+qRTh'.(pkh;c&xs7 M[ks@k Aa0kP*O-\g4B u@GS14 yw & I\ُas KZ~*|Fi>ne_GI̥Mb[jQFXl5KkT p{U`au0|6oWc+rf G"\ ,TQa=%r⠵yg=p%5T$ S #l%o.AAjqeptؿk3r,2 (aU %>UCaY=sK{CZ D!|x+CǍ$ӀIi#Gى)`q`;3Z 5>p L,2ҁ+R@+si@x$^Ձ*%^A;0Sh¼vPRG@ 0f\yi1Κ8OR fn:P%:xeOG;q vQ2#:%!P|+ cRoe0@Ftdaz~:P%:3@N`Fo Pi ";:@A lUV$j*pn EJ|"y^bn,gb@tX%,B;n'Hz ?OOζ/w6;n}M$)mpxw{ |rwkCn ~^va߿hzpu@o^t^mn0v8>r^Թmw8tG'%\ѯ{|[^k#~/PiLZ_}3HLOex@xv|0D(F*ҝüAҍ01WAe(՟h0}w Nz-3o??5`5|~ N­;'6jrAb.ɷGB} n%j=w(veJ:ڿL^ti1cSWrp^q1t]~s3O~yN:Ioz{mIY ?lc dtViO߃Rϰ4Ǟ Gt}7~p|?Bcw|]/eۃ\n Bpppc.p|t];S['!T> C2Q1oo|p~)|a{tk=ؖ6v'`4hr'Xw1ۧv> 4vJ_zgWl鍭Ǝ7QsP#xLmg~ξ<:z9_WTvnww3yC/4Pok,D+n䡮e F'Gcssh2Ca .~~O:rB&&n79ZvmN+7@oal *B [1p9(E;C߬GGn UTZ.+ ^ (xҿnʧj k)}Dž5Ci!1zBif$rB%#*R& rȂ 8CHXӁ95Y g WA*HX a$,/H( ɾ :*4 >U䙅B"af"(Ha5{ J`A,4C#Y^IuetxYxxLU$Ǒa1)rd (X㒒C-4[j<`"+T* 㚖S8*Nq~2n ԉąv L4"gÑ aR*Ɣb&+$ .F=|KpF\IiWZqxU@;xۖ1M0`UiMHi=5xl8emcip|7V۠Nli60vK[JrgWEɲMb16(rɝٹ<0yNXb< 0T`UC` U 0.>7P[& a+.̰ 6ap2\,"KC Vt?!i5C^FRRZP`L9HpI%Vhd LxI0T0 L`BV~2."q< $Dʓ)O G!fZ*p4SB8"R9.6)p.$d$u%qS|i(p4˔BKhy{7?|*yI}EWd{y+^NQ8׌L(I0wt&$k+ ^Z HNIzա#ƱPtGћA҃7 ew 9'fȠ2aZY쭹_D3w 4BX`r7 F(Fo[Uw)hPJF;WE;cG5}{{e}0,3!B~W!V[-%uW&`KLl-8I2M.12cS6*$iua-‹V*hWk i..@M^{BzaS'Ω o%<$< ~ٕXGɅ#NwKF$KBK$ҮHe>Y}^}-b PaՒ3&)8V᰻\8 G\%$Da- fp,CE+mߺq~fY0?FeY4ýZ_ Rxg $1E7Mww.K-[We)14=^}Rj&:! DX_uLQg$t&MBg$t&MIϲ<0"Z]y-;!H5^-2 s +PDM2"] ANJ 5_j֒# ibJOր%P 3-W2S+Jb#. 
-B ce4+D Κ[&0>KQ>qLy%BrR+rU p3,JVPʐb|r$J!_ /)S`H!.1<$_m"JdHd#`f cbإc nc@ f3m=sKKFA_aye ڳʨIvd'Ivd'&م%J/E^ y=Ptq QLX_G`=5PO4e Z"'EBr%`Y ) NW_)J26cMy*?,8 Y -%j- |i)t\Dd\BL;5h*2(`k=`p-!\ӫө>3׻ kbTkU2pf-R̖}==9m>E8ő=*[qب8UDW#|z _d Jֽ#k Of%knB̓,nF&:ɚvC=xmR“ Y(k w%LY 2d^|䭰44I K&!XT{=mh Y,Z\jzʹH,S"/W)5EBe]]RĄ{m]VRDeЗ[cIH2[O; )[2i.fDKF3bNim氠- c-T\®؄ZfjGREK 5' 'ZRzqc0 ;EFK kxFSf%1i2qBE+)#8$'0f[=XFK.5ޙn9m︫A6ڊNC4:kcGwWpUx]Z'XQtM+@랏?d/vnMi L `t;TP{]TA"`ሱcV!$m1(7gYsk N_[ي!s[Vˍʺ,0zeJ q@Sn|z;Cx "_aJC-b.T9#TMX#"гR*&y\J-gAC)njMM٨X|E@Kk] ~-[g btS2)ʰƀpTJ!DGr0`Y^TBgE2ZR"8O3nZ@i;8(^\|{9B9 .;;:? jҖi6}Fě]Mg Qēnnz7W!/FA _an 0g%}Yr iKP );a}=jl=kfxA 1J*{ieDDl՘bϊÐ>jSXp槕\\˳GW2984E{uJHZb]Z~Uo [n9łq >^!$}]j\!2_AI>Xc[pխpTerX)mRjpAPlV`SS)ttjfsKP#vE뜃99#NrhȈƨ09}<~U߶4~csj`A?v5GoXVkށlO|wϤ`Ƥ&~luL*tvnCbŝy6amo\&&ZpBuMb`>Q;0vWAoӏ༱v@G֧3k'6^5׽0g8F~m~~ >McaiĭW/Q/;@ƫч鮇(_kTmn>qb8/Q%'"߿>:麿&?=Nk8s߷Ip~Pb͑a^;es v 4ĹCt}HDJDKki}4D-_ǝCqR:!gǯ W΋aQq[ܪB &ɤMs\cA$Y*wB{dK熸5u(Ux!w'Г[=^)i7ޒ >WgBRC{b^g6S(ڣ?H>cr+Hh%y8Sa\%q2`3HKlלec|"4( ؅rN<8Dx*L2 ?$>:#dV˥((gQm*@.T*>ϮW*>_ A/ }2#o/IgDCOjŞԺI'SFҚ*ˌ,2dƊ1Z¤:NGb 2̟]q*t[ۗn?KBmn+t:t;sשf.Ql.(EuBS 3āO0ϩf?{Ǎ@/gs6=X2E| ݇E`j)Ŗdb)xԒz5d L7*V}_%>zK:@CԽتvPYjQ?IyNΰP"$y!Pd@ EDGX-O?<?[؊ַ>= 9pΌ<~$hWS"*+Q-J|J gx,;?t9jka+/6[/s"yM5ֶ[54!f0)g P FEp3٦/)=MnӃ=LٳURhm' ?)" ;xuPi2mc\Sg}5O!\>zH•~z}&ᗿl:\fwjx@┕ⅳ }ⰼ]~m\C(Z&eV花ڳT< eFVgK)*QpiEbS&./)qqz'ﶉmb5mSX(@8fwެER>챃U]d]ן0k z4ZŅݛս?P~jI? v|hÜYh E+4]60Lz:%*( =ZxSr-8˴o9Жm9Жm9qe҇]H.؍{9JA( v.#F 9*Q!~h_p!^_d=`}oSK 7g+ʥr7nT}bhW䵷jv*'@R XccrJ [c&d(**U!#@~$/6$mbr'o6w]ls6wq{%|q|fF&2Ɲ/:7^y]2u@:.u*PnA.3PBm/nOSåmϼWxyx63U~;?-`V7w>)ܨp-;ǂjG"J鏖~LFNjώ^MӳhRފt? 
jf<Ov _ﴭvc]%¸`L-*`Me"F*i*cМdFy'I;@s98z[{}ۻ KհWqBXԳS1'Ϟtg2_{z#aeɓ^yR>"$z??8M-<ͧ6~ 29޼ڝ"مVܺ;٣i"&9{?xb߽xY:9zy7 ْUv%H(Y1WS+ϡ TD*LjPؘtۊ/wۊmŶb[qq{Jx(osp.qtbtLbȚ/XG|j9k~ٓ'w]g|gZ53zJg."')(T\eFlJ#f/פr:WU,aA>C|NfmJz"MXݎ8D:1S՛7E9's|?cQ]?Q孳'ͼV45/~c;UZ@O'18#3?cv(|גdҩWr2)ӝI~V؞@3iγf:T3T"#fgNN;;ٽEPCR ML?Ϗ kkSOA"vm Q/яHUi[9-02`6]4Ar cQKmg!.F+ l*U}9P8cY+쨲iT!lǸ0MH~K\/^b$H0$^cA^oApFXq)b8D %yE%%>DBr@TqbE}2Jl>B#rnb( aXABEx`F3AZ%6Y!H aD'$X:g>{iVl&K6 mJ MS1d;zvOX4++V kGtb ڋQeêj:T]xT!*_(`tyjul2@p$\^!vc^luQ?$M&eh׎%Dܽt)At:d%ee6q(;h &*t144ũQ\51(WJE-4y`8hL:yYS˰nZ/ᴝ̤G!s+?\+:3%?5m.7>Ӫi4}Ɖ3\HQ&ViѐLo㿮 G"Na:\ptsxHMn*Ay>%UT g,ڜbŊKAlf 24ƟsAp-yy{gYYmvsR&l0m%%q$PI;:5v܎Mh+mi SDɼCƃsFihHQTe&ČAnФb;(jn <_x MA ?@^jl`x@!e 6V*i(%J lK "w-5 쭻Yٰȋ0橰@+YڡZ#]Y) 0W[I,@r筽6ڊ<ZH{Mkmeﳏ.uenEjN.l,Ds$RRv$PZP7qH–sHk|d.As :^P7fYR6k1]ZmLX3]Cim]q D mK%+W{WP A(A~[3J†k{qytf#ދC)׿)V$t/1-YN t0b@pTȑ}2ÆXN $f.RzY4`m ēgosQ\ JBG8y5=b\mMXήU\^*=إ=u*q7ZP0XUjȥ$*mB:p^@2)r",'ABVCQFCFA^HMw7lfݬ{1S-k(MBÍ/z&ڞ \[50XѴ [_C>$0B2 ܾFYA[cRp+^jkP.A ;ElC -bKxE&"6(߷-!gV1Lb^φq',ng5Le]0EN&A8kHzP%XuxOrzM']*tmE+c;9%A:*xOAC[iqL ݁}֠(+Vf'wג2F4{G{醜Z·]szbiWs+1q$I3@I>lG\cUKsZམL"f+b+B`rtz0U;@{AYcp^XJAlE*v@:fre%jXӦįZT ô<0׌ cz7Rirb;=xݥ7'ɽs'cw tU mrUR-@T5D?`62@كp$"8ߎ}Uxs{4%:hGVWWc(d,b6.-rJ1%\֖$V(h)s[a-F 8N(S.y+tɹt{eiGch;zbtm/+`|V|DG{z{+,}۫ nUn%W.LdR('٭߯{(9!:Kt݄Z/?n2&Y1㣿= cĖeL?'vM6oT굩 vN'EX)]~j8/&:QqӀr~q6Hr5L&[aNn5( phE \&k ߕEbNOd1) ޘn۠F1G`W =jzuD{˒{L -Ȋ^ئ,Y>;ʂmU}Mo\1mP\"xM6Px8n1-8&aφ(JM"}A $e-:ԺUY"kAyA Zv%1+ IԘ3ƘR@ *Kk: )u.>܏Hg!78aGΖ%M7m<$%/^o:(6rptlO 3w+%bשI%Rp} h;WW|l0\J׫޻Z~"%/SK|P'~?[މm/io BXaC&_.//-GpZ] C$ 7.g2Ng#E$;>Hsy.mP<|N !P,`g*_sX(nK$- RִHe+/; ,gaQ/wwLY糿MנM{0[gE>} "[~ w]ng߄>_Ӵnٯz

VKbB PO'*3*t(TGoΞk3?ity<e~OCUsU*kqĝ9=K7a%'ĩO)%D .KogeQ129UU"l JèuQ+nH3FIs>h h֕JƒM5Lpuŷ'J )ـOnH}V{SVVl-ڂ!HKyX8Dq.rBӿvڐ'+LxWu{[4m"f6ri7ҵyɭl mn<-WҎLȶ]s\K6'qw*@Kgoh-Y\Hz­nvncjkrm͝[Ebs,`]2{9yUgvVέnǾhT۱/ÇhIXZ{ϴ揧1T~_FcmӾҐ\eJu]Л(eV6m%7\&+.u1djԡ`&߹(B1ԥ7yC=*[wT瞶t͌Z{cС)VG FR}a-(W%#*bhIFKݩHYKV]˜:iEH}NbgMR$-P k^uSmgpm'}A=@V`=i\1ךhR'EQZ*"*n& tj a)_W6#dB5-Al!-w[I{G:]ڡiZHyD7%r#G֏Xsw=0 "l I16XI.yAr ޘx?_>}hjoy>QłU H4X'/NmckIU!i rUr$>#AFZ%MpiSeM(T Erfmj])A"\R$IHtД  )*(Uq8&=*%OYr5t裣pՏ}8ϥ2Ehe\TBN&+Y[kn,3d t1@2 # 3`d.ea"7t=*4Tx>GnqUr?e&aC8O" tnr 2,B(/eKlBXo2n·ܕb}]=(n۠\P&R q~M<KZcl髹q_FJ'N}. s-贊3眗}$OU}yݖAיf@''4ɀk( N~-OZi[NGiz fKcsqm͑*wmQ|a݁δQirshmњ Ϛ֝}ath\Gܹ 钜g&>qA˛'lp-2v^%;b=>eW7CHlv"8|ysk5ޭh23z|pg_Bj^-_\\uCKc<WǶ\8֏`߱Y[c!p;XMmab|Ϋ|5K #'7'XrQQp]ijv}',ċٟi.!N!0Yf}u~I8_ޜ_hqJ/YM7g/f<\jfY,rZ0YrlVReLv)ٻ)!4n-ZʲDSѠQDI)**@7|;ٿKĭdΦ(YAهRLƐڵB5ejсCe(nRO; ?4\c%I5Sب0hSVK$e(AI 4xD] ZIZr4]Mm=)|p9FfNSL_hFbߍPe5YKP :VBwBX<$ a D&!z)0}6bُ6F5B0_oؠŒڞ$<藝ݏ ֹLܑ?%Cg>s7:H I ticL&u*,S"- ZMIێ0I? {j >_JkÄQ 紑1k%T*lrOEɊb}B#}'}ʽf@F][ 5Z#P6.2:UޣU(?qv_&0  fag3mNlw/iC%Hy4n"!"/3^thZk X=6QB W:S$J\ꕺ] SZ2/M94~bKA6XzVmNU<ǛYPt|zx[ ܷU/S[t4!PAjˀ2i b}z`e, x~끍:U;!]v\{m@ sJIǎwYX' n4^z.ܑZ=@Kh9}gpGREHq -}忦f [Lw[t}1"NR±'s5,-^n-;s}w73NT~Cu|]䲏 [6^]sia!UQa>_H)b@uhz˥DߜOO=i}s,7\ͪ?~{ݩi;"ZxO1 AIYټ V!J%1>a9Mnd"zן's& ט2YHMZDԁwxY-JyŰ2Vļ>uTzf缬8BX5_g5q8;97q}x/'Oހ0|ДL ϥK1B[Ll/[oMe+S^FB^m87d$ O*Wta.aZ~},OWKJ7qK/btfp̓5_)F40LLV IK^2 G/xAcfH9{R6u:Q}x''c`pKprH8E2[Ts0ynˉa/Ѐ`o ߲ccbcH|d<-Lv7p$c''|n1-j['ޓ?\PyǑh;FD"]GDH<rs<lq{?W k*}cȒ)WUg; hSc;&sQ?{ ϻx|*1Msz|L:'c%VȘ9Z@F1҄@w|S+u r^mVS~+a:xKst${)WNOHgPǃ+HU;uT9vQ,0ACU ncD»JԡrM;kb{Z$X2L}L+0f ;#KN1M5_M<ԔVsUΖ c>H0.DA$`yA*^VX 4l^s:p2jj”Zr:4*/6G6 Px'R'#aJ=3cG+#pE3WC[r5^ߡ7lv7f GQ6mňi8aCҮ )D^jmieT z.|`i/\#8@5$Ӊ n. 
)+ >qgr'yai3N8Wn${Jbr^8>G6js 8CHB-cJ\ QTLKO*$~>VXCX /t \*sE]Q: ݏa|=NI՘K^cg+k\My܀c9duGGlU񩉋J,˗IW]y|񆹹["YH-/?;a25tqWqV!t^ sm&Ƿɧ͡nͿfx[L:%!0F(nk:҄46jݹ;v;2H)t,nx9 U6[;Pֵ%RpMF =% )[.nW®B#ʎ*.%+Qi !JSS|m( W!x*6ZhQ -hrR++1 01eQI-݂̂h@9 K@EN1ֈlgŅq-'p0 Iɜ{ї ɍ?[n V쯛>QCn-%SBJ̬O0oXA#"k39}zWyg\)àc[ t gS.,)y41KL))iK)C6|5'Ak LiG`FPM@j*BalZ¼TF PT(!ޑt,Ph D0 I3U(aY!9eNhqkA7ױ„ho}ߞ|ϒgnYLO>|+_}T`<)flzC9<s\N |' nE{.mv6iwx=G|S;1hȹ[jpA%35lPvsNٸ S6]G+@%YF).?\ Ghbs>a3yy]yfM4ɦ}ûQ[*1:+2rBzn/ڰWnmJxhfys؋m7b20%.Tm+mPjh=lNSkO{WӃ'U@8N6dwA:.pk ѿ.5Bz y6H=80`{V#YvRpt8)8pP;6]>F-.s6j#৘qr\CL5Q9ԑ2i#.W® Zˣ}%C9$v8,1,8#$X^U)TsQ\Kr_kAJjjrԑ 3%{;x-?,{wv-h^^;0dM,\uۋ!DbH^)1!ufŐ 1G[٬'#S->z{ 2h"JPSLQm)e1WC)nkR6Nq/N º\CDwjXp!w( ĖtT`69Kqa|<̙m`Ҩ^5;tw==xf/ڪ(F!i(<x5>wm)ٰQƭ݆ }P,6_fQs&ISp$iihSTtWwc/NSY;ΏyʿAk3Z0~;/}3Y.$gtSN&{&?^o%%V[ q;LdX01i8g9ޠWZkn\K09n:U &1Ix39騤B 97%.W(j 5Ɣ3޲l#=AK>JQ[5}9]6m,K%n XóF7I./q.A側 2=DJ"Ӛ s~L˥ٮEk{)X8} D;Xu3IIagde.KR~diiSr- ~YI _a &4'&\9(TQPK!EO h\Đ&bJꪏ/N~Zo'7y =NblGS{?`yNn- m E@y$>UXM1|m>fxS[=%Rvv g@!#M ^C1p$Ҽ;Ƣ72 w~qޤ1Κ4Ĝ/Vw5WOjq͞ >¹mLx A;,#0RGݎP\aӚHt!˕F{xucn20/oꠧOV,wM%{anͲLnъѻ|Gb&7vz(#;2|i2xeԓSr{}9jffԲ-h3yy{Cy-0C}l3Qd5ĥ!j%&-98R4|$Qz=B.Kй0/4Si>?cs4V]CkfL] 1}%ߟߟW2#Pi!<̖cVDY"KJ;cm eJj+>8BsSt>O#bkAí'"76Z%JYo*L243ZpC LaxJHíȈw'qǵ=(U<6`=d :R,<īRj~bbGw8֤7O'g8??cQgg Bq񾗏i4ڲ<|^*y6$t^b=!/e7od u(zJ`X'u]49%'u[Uh]ᒔ%P֖T[X*0+#&H$CgB#+1iYB+LTꃒRt5HIǔ mI#8 /CކQ'`FDYLtpnm 0)%*Ĩd9/od #+t V@k1%[KpXDrDeAsJ93D"V#*uQ8 +_BWXJALHVsR/Wb>!c+7G,kopz?tL IĻW]y|8! 
,ܛ藟ߝx][#r+}AUؼ8N$ǰ<%X{n;Ҭ)iF-{D6VZ^z_bj Ϸv--\ܖտmY}aOy'`lnD$(8paA/8 B9CߌyNjmT/?߿杭M~2uCtK/A|my#Of|1=?Yyl:Wwה, oBSRu^jU?^;_~`:ޞ/ z4w/2o)Wi&K(fڮeLp?b7/+TQj$^LjIK+[J8gH%JV5y#?.=o SѸ= -&JEYr ҾTy+NhͨEP]YҺwhn:eAGjX>.>zbFH-E'8bmLcFZ`j EQZ֤)[:ϺU(1zք'*-?곊o]} =~\1Fj:lA^Z%ȍҬ6ckV B@5Kl+2뻷%Zr8w96OOnHZ9HostB{0/pRi9%bQ+,uR59#C,>!pPqNUbnO3O>EyAM2ԉFuP3KۺE׺!)1jN@\q/ wd:^t1\?4S-}U.k_ܚsJfm0davipM=7j(2z^ F3CUzq1_@OiqA^f'MZ2)8;?M'7>;4jgldI5ME$m+ESVVQ ke+)PMiG\%|*CԊ A.[ZU`I*PR74PL)jJ)Z6 AY`%kk"H  S"yI(H@  㞀Rw>檈?`)w06P @@%≮aOwJ9Yp"acKD*I?/ Q(َ_[v|0lGx^,ڟǑY5_,?<07!?3+O9ga"Îly@(FFi}kwB>6o1*@MPOgp!AM!r 3yq fL?6^{&S(v^J.ywOt/pڸ?}{cݴ%+o"*/okv4i8uC;z"= m-QN0֜ 3 orAv)"1p!"4ˮ넗 y{Ȭqe|tкwTrݐ>.HLũJPLݮWLOjd Mē9n 3jg]@a͏3(V뻲l⛻qlwƱ{k3em|Ӯ'7xn0ǔ:Kʜ% COisSnj=;㠉Ft lҊ?>ƀu!w TjZA c*բ eE RqY5xKѺ 4d]}@+1:X#!7A8@VI$"ZWOqHΈ+\X.4el3ɘnVw_DO+Ss nM@ =N6a =\,-؎j-f}+tx^^d+{cD_.,)T(STl -Zp /; Z^Y/TYWUF#*UZ+JEKdԜVkY7 (nI͏|zZPZ=(7䔶MnEY(&X7mi(UiMZ8 {l$jQЦ5Ckd.IQvm{TҎyۮ)y9SBB4/:`%@i:@܂ -p.WE|,;XsU_~]0i%<#@(BG]򱧔'Οl4\OQBەL J}zxX&eJؓtoZ,ˢypY7uKqU!&y|i]OdQXzۄdGC)4L{/jţh odGB.tIL(Q:BfB࢛Qa9`N~u-5tV2Ips'JùBcZmm (RR, hU$cnniն#-|pKCOZj _IF?=ӈJJ{d( xx^ʅJM?%Xۏ-h59t^}~4B WvRûտ"1{o3JB6x,on8YP'Yگ"5_ADl 45I_=g)lT~q EY=)$PfxblCS7^>5.ZͳƖC~Sn@f.^_?k@e/3WY-f_=kU0*Oont6ñ1ʃ;0^lж_yO~z{_O TsU[?'Wo)JEsUvƟV˃SgP!0mn? 
[ӗ/_~K['V^eY&EN7֚[{zZq¤έSմN>APwi$[8 *U/(D GGx"w"u<}vxC*>3OuO:::>3κ?ΐZE>s*}u0<'o>i45+8'w>QKV b@tKPF&:=#pčt4ZG֩k\ YqAeT(pnLԐA't7HV׃ˮ{~u 1PD619~lW8 xJ`N/lB(S[ ;U& iGy  hhq gi.QW,%Py5\q.QSKm\.!W9eKmܺ{׹ΛH.T9Wu_OO]dv_]~J~ȫkUq~2pH?I yN3\Er?BO}qK0V T}j - lH_/(SI%BunY"rwXkdח?Nmw5%.$G)+[eC#u?^M06 llxUE$qDFǘV5h@M$|* -9mSТ@.B1Ǻa-7mK 2IڊREà ֜VHJޔЦ5Ckd\ >0$0kjs{KZ' Ab?qD:-e)ZҍLq#tBǏ;Kn1]0سY4{!u#yzzhJ㛺9b evTH ސ,2~~suˌ ow "B#ć9d{Y |tM:G93qL<1c#zCWP`ՖHI; sGZলө/[S r͞Ϡ<͞ChVi3_+Ej go21޹ryzrmDF9IxxOdhvniOЄݠCNgXRvZW 0cN)-GBfYN:SyƖ}C ݁' e;KGd'A !9F*EQlpRAVx.{9>O׷7׮ɫ@gJ6˻r-kUb/!Z?HH0ɫ9cbp!da.<..:%6 Re.l}4EΙiׅi.BRԚtոLjeeoRlPM]"R4ue S1UJ.ܠ)]!5wTdz$e|Ӯ\(塚.WE|*TONzkxB~u1ͻw~1K.D(B]c`Q֍?T ԒK -pe4I.ov-HmHM$V`J{7_%؃q& Ûh6ȒWgfmI1 (U`ǩZHKwucix2_`8?b9gZ{u5V 1 6e1*&^JZmI{İU!Tg1f'<b c'HKDkm?]NYra,nzP(kqYSI#~ϲtӦ)!0ڄfFÍl|? 8m8VR|.2SI*,Ua2i/©Vjn%c̬dsr 15~^1lhw{ ]}}U[)?[ĥdC9W|!(Ժ{y4~]8(@?*/'v) K($6Kj.Qs,hM:gMM ; x051(,5Ϲ쉈j> V/ٔ |WmMS'!Q[%) AhGs H4*X8g ^L7Do'`NcMãE?ggy2%.L8Β TrDd 룾uƨ8.3M}>F Fq>G?G{So>~fZ!Qn-/^S`T~ޅ͜71@xv}fx6z5GSOmB >뵕ﺭ}k#1!Y;PmW ;[F*pU2ӧvpTjVO8m໒m>ZGO%m>kFk99pڒޡ\qN?gp*}&fãZ}`׉V \KԿ܎5D$w9f#)⏧R έ'manՆUf;.M4g q*Sur/SO#spƙtKYb%H&IlBST(34ՆkbdNQ- goRFF>"Ap,SI,LtqNI}F$J[M&5,C.5XHAXfǠ{d.,Lwv-\PjV8iZhA |i^?gj~2~MGguM֐/B`վV:X^N,SJƵB>"ӌRL))ʴް#I339} j:YA] Bl.a˟yq8 /I_yN7<'& #'-:0FlsM,/5Qso <=ƨ0/jƅO6*6wi,l{](95!W˦cSn$CwRxM,f4daꔼ,k3,=*u <&=mG]U|WÔ!#%Fe1}L|Nϗ5 >yO]._˕6 /IX 'ۂ]%^q!9u$vL^`rN;#@bYДLpTnH :J JWRim2|IjTr|N9aV!aq|sLbdM@I#'. +Zd^HyΥhN8UV"jN.Z0TRSp2Ȭ@{./|eY|HP,tCSIef aBZtXIH;|0wEH&?ĿMH>,a<(lʐ}PɆgLYdg̣vvlN(OB(fiA;a\kdL>#cN $)flYkA>N $s$YJ4xŶiixbXӁ`[$zTA6AG6b&0W_Bu$3_KxoCB:H_-uH! 
eyăQ Z*ybio9%;;c׺?o[ bN_m71w(la[awl{Fq{%SwS5M12YEٮ4vFz3Jv"6z|Lk/7z%VFj*D<LM>;-(ʷ.%DUҊ'×/烯28||f W>X'3Z5U36F(Kua{d@i8:2omu`{#@Z(=|9L'58N LJЄ rtpMc+]{W"XRNo|gg՘#)͜6١͎6wd!2-2F`T{msdljz,i{L7Uz<~e~SY sN.B|W2G qr+PmT2 4KNS2LhqJGr/TKЦm71@%4mJ;D^Viy4y.`=f\0MN/Ϧwv86sp).̿ v \bj*d̓лԸwKz}ƒ򰠋plfG*G$ѨACGʮwVz$ N}+\vwjHB^֒'xvLv˃ѩ*> vj/4V5!!\DBnMv?Mm|oe )4[FRK>)%!JEIxHHBw|W!\ h;vMԬFfeNqGkYͼAܾQQٰ鎊jzft:#z1ۡ0 dL1*T2][5FGSrQ2+mbcr9yaSm1oqQңBfGB"U?e6"q, ]L^^*R VEJQVN/hr!R,Z/9CUv*+'W0A x {vcŞհF|2oލk~K!XRőaYEw5ێH u3EJvUpH*t:jSӱwe/2wI`6&3ZeNSڥͷG2@T|ڈF5v(J>e;1*sL3">jJd*H ` #5^jAJǸk C5-t?cJȁIU SlJ[ϹPhC)5ZcSζ IZQxM,fsS体:L}N5\[mEÆZO9RD gݻ]M$ngn4| jzgz},2ػC ǽ~nF?s0ɳS C3Ȩu314l60=W/ɗI(*j jB!/(v:p^ bTT{ P*> ;E.eh5b# ^!`E{5`Qߏ]%mg@"T-Na})&iHm%O8{.dPdI:dktdÉφ%Mr7mg&ܡVi/-[#JmH D ,Fč+ռ,wf|큼t? z:O1_f궝+,kJ-(Z$10AE=QJYAaw^)kuʒwrPfA [VvpRb5g᦯=C^-ƀ X9mMf`P1AZpݡ4m y=Ziv'!XGYh1% DNF amaР({#F?MA$餽.P1RxMп|S wcmuG8y6A$_I|]bg`pfeVsLC_!R<â=ʘ5rAIE[簞9&菋Uhw~.. R _Zpo\G0|D\hܱܽĻt\7B¬]W4PMLbgP…n M+$GEЋaU 6O1̃xvkF VKŬ< ~!lv]o$n7䲚2TIl]v^j--dF:Q|?r}_h\RHyuHē.n\JR.nEnnfx/bSg=;Bn=ʰ\&)⒞-43E._O4__^|-SbgO,m-_Ͽ!,d=#3 B(|Y=re9X t6܃l"P !t}UhmqL)3i;P ;heL?::sdc a6;D>,]@U\2>̗1K(s 3MuA2uZ։NČa[0N) -Me(A 3ΗQ0S0hḽ F@2m}J ZVgoO1 M& gA+&GyV -*EDZ\{($ٓHfQzhS(~k<"A-<@$R &s =,ͶCeh9Gevױ])̑q}L1̉Sq")q6C~%V?s3ԕdl1̐mzfaVt81㺛n/uU&yLGwC' 9JtA~zأo)_.U~Suzm<ގdž''iRQ k],|ybt|z|IiWAν35b.rEÇ:lD@L'()z]0j0(8]DZs~ٹ gAjasV2x,ti[sgW%Nkdz_0QxUd\+nGFO;t[oGv ˙ՉMs޽^!m0cI/b÷m;,GRV#pFQmKzLo %S͞=Dnn9gFSAjŬfJ3>>.:r7^ZuCq幹2޾O\,"pYD)>9*h-CFhR) ~sd2Xh9z_ڵOauSm çB#x%|-׶c▤o趚zW| Y|)Df9fLB2ƜLIrPZH1 ~޵-ջC(7j)mr\\!׉$۲w!Hk+  J'jub`QЖI**낫=v8O4&=m1Z$$9Oz:ACZ=.sEU!G-R[ک:gR;>8%> T4/Cj]rr|wMD[D_8AfR|K kl: QYyu; ~Q =UゾWE#^#)YnyCPX2TQ˫.\\َE5S^jw.B)>N?evX-PҒl7߮ƣ5FL|%؜u@5w``E*]|ە(}"z3lG!rRS U[цoYXS=4?謁UA?5#͗ŝ>GⰶUaA1v*pF!;j.߰ 6dU1N G[}(WqzrJ%{/yPQ<~cG0 PFɞؘPVH9$eMs-ƗE/_1lqS@xLdD2Sei"foCt. 
utN`ugu-Jƹ%sto]s5u{4p%,ڬU2,dku2&,z2(Mo2iGP6Rsd&@ɳ:`s*p1΂ȣ N9naxL'S-эQ-\ɐ oRR.KG"dz *ŲVaJQ䜃 1@I"T\ KSi\OB;W#2yvd8q# \}#JgH᭧3,L|=KjPG|Z8ăv?;)W]y [f?T5=cVfӬԊ')`bAzeG*持Az=qKn"#-KT&I4Po!cuF B%msRn"#s,< h簮h wnkI-gM2s9J5Ϩj*sPȼT^E&/Nue(߯S:_/)?$B~Og(.󏮋9\-2m%vn`c1B)ǂH9Di*qUYI+\* .t涮ӫh# ?{ARۻU?zHNvcYK5&IYF+YXE@ V:l1ID"D 22 Fra6;OsSr+ ǩ䜷>N])h%l,Z6PjfsJkzfҕTVj(;KJ]Z'8N]}/!7l^smSnpw@bqÌ q߷T:T\EEHEߥ/?/HUHaхh;a ]Yx߉s^FQ6dt0~ )rowdxm.$ 0 R\O3YWa7յs#h62^5ۛZI2qG:&i<2ʪg>}$ސeE91'qVrdMYZ AbR)]q&:!{y*)VTHaAoQ-/^@Uhl&(^"{K`p>k,*!c2&RgGL*C  ߧ(mf:ù-Grp= j%6@$8|K@i)?pb֖x-)į./֞\?+P+(D(I=d^g ؛9b.}gz_~N/NW>{u^#"j.sg_3 ;r@@e3幟lLrJ"XtHrxc٩ WOyX":0L("tY+H,bΨj?/YVk>tnnRƣW{ZU]>:r@!xCkNzhČ:w=1ڡ5+2zZӊJ;>SƧ2Mhw*L7A;f&oOZkGSnZ5Q}6S`ڂӜ8?UZ'Vέ_[YOfɟWܯ%/x7[==:1ofB[f /# Heug2}̈Z-ƵNW%:'`[xJHn^$cGqo]0N/ UL p1FcZG%+ݻB IFλUwά@"|Eá|;Zhth՝8ũ ̜Ъ@œI"jr eCNWO,$ں!0.d4 \u%zΘp ˍ_~W(֨m% 2IWE61 K 9Qe[p.7ȗB?m"Y!% )̐T,,qSU6/y+~ h! p{SY9U2S5Hw0aNR-uu: uBJ0Dj^1 ZX+O@P-p$T"Fz B9s.>zѭad$(+)1:PmhdT `NԻS-E#yVDI3uи"Q'A2 GKoA#V[QyL v)JMY)#ǙqbposЁjE%掳1OdARTAXCEsFGh3."3Ns<;nqj,:?QƍMn:K:W]\ma$o.Dt YFWW|QҼ>Yzt_bDޗA6f7w''g7o(ScS~3[͡jߌcثg@*oc`VשrBQH(^Jm|Ww?{'XVdTJxI$"*dd"eUF6V"Iyڑ3N&5sլBBuOv=)ӈ%mF_ǵLǙ D\'lngT #wAO.jw?p+9AIhExHY*,D^\X(Q+TCC\\K+!Kչ \4e.W f0(yp,gvSdcx--/H6Ox~/'0}{4;^|S&$;ZV;)rO)T%̔J.* L̓,Q }%ӜkM AuT 齾Ҕ=F<0챷@dg E 0T6Br%hĽ#骼+JoqJ* W hꛤ(Q/yͳJm{)h Vh\o"lx }'va|u^eT*8|կqű4\Z1|=CunH8 y9 p-է?4Ʌ1 vqC4?yHE6oPYoNɨ6:k.31N.o+wwq l.ϳ]7r1F{2n⁂R$oT^aQ8PB HF #L~ܼ5pJ8:w(NGy Y/Ǹ|::o:U? Kioڧ=׌93]22$Oȹ.MH*mE ;c|^(ݳ]G^:HQJQ Шc\"0 ")AULLVDp! !W(hBV e. \T(PcbR#xD""$b;(ljHNHτE;cҦT˦ oru_iO*)s EYOZ su8>&pɢys5*b(xxYQ wݠHRf[$)Ij4[V,H&>:O;ZadoMuJ( n4gOJ/t- c!!JA;l],!kӠ󒼹AM+㦕!E3"$#DŽڿa}ىc.e_߉n(f:E2ݱ) M9OJp-v[tၨPc5||9}۹rx}/_}a3gOWwZyM Fr=ӮN]owzz#nBсjt |! JSqlj)1*Ȯhbzu Ĵ{Sbڇ|&ɦ! 
q)*ˇʑiv,Icט s&M(HVLxdS ^XaNsQ   54\#("#^vHPx`-[K/\p{,]%CeqQJPpiМ>JiT „dhEqAfZ GI+NK;!6Bnդyx7ʿcH2] O:sׅ9E/_in?3vdkQN$Kd0`UHpaǂg4O** &Me$#CnLJ6༙&3浩`˦mH6`{]&(j,iBZH&~/a&r?We:#2W[k҆а@D:oYIQg]d>9$yQы?m#Y^%ѯI7Auҙ!C@X 8$!8[Ez)$G-s ):: ~w$I}F`eOkmӏM)ן4dzow$v!gآg?J?<9=6fTF)[Cۖ$JwthNR>+w~H/8]$"->\0J` kpẉ3Kl9G 6$UNNSAݨ,rϞPxu/aЩJI{ \J/U.q FdL^yt$&_Ihe0+Tƒ&ƌs&Q̭mT=4_K[PdF?CZ)GTK"_[iaAnkt -u,8**PW. 6< mJ`0~+e =TYM aF;0\Eq9do'}&]_Ϧ3|eeKηrqc/J^$Bβ] ١p1&B)Ƈ%Y;Uoo20çK6!`FFO20ډH1ȧӏէe<'i]3P&9bKʆ~8C*o+*8׷_uc㕫f՗nKlUlv V 5xny^޵ZhQՔݲoTٻjʨIF]>>w=zzҔ(9u8Bv9:KXAĺn>T@OE`‘@ŸPyic\SHMS`T.3*vDZ; K]'ԟQg@PՔ&6{xt#EO~p)WSsN^vZVs}'4`w5צOޙjtґn즣fhfg"Zq+K4#/ts滳%ײoL=TMIn2/U_89ƾGfC^g VpfW 7/N;:.9jo* ) C WsQS8QTT )L$|bVj;vr㺩UC_A)9]ƿ:M EghEz]suiQЌ_:9VCa ġTɞ$_id$׉7 J1t`We(#_d9[PҀ=#-,n5ӄ4桨 ?׌*YOɨ65Qfl<\mtjx l٪ϳU7z1(ɹVFg$S(,0p<~Fm0y_q:L ]_t:Z7+ݯej1"\/_j]Pc/'Fr} %3g3O'Yf'# ޲BYmAy"IOND%M#]$k@qws/=R=X pGnl(k{?[6VpVH>?cgs , sw_$ #LVU ׻-BID~Gs]S`o.JƌPA`d47j-90d<>%Fᧂ[7*Wg̓ gb$*m]7A7@+q(.˦(ʲE!0GM,eB\.;YvVVJl+J̾T yVU%%j/Juqn.e/@uy3>ֱep_Wшk5 | 1]8Lgh,rE5j{A~|lL{0{*qSa8qP}k7` 6cA3;0u"Ԑ dQ)v¼d^p7ƽ&2_EC.eeeq3ogd>Lͼ ,BZ7FˍwI̗)3*IH(YB+ qlgd 3V|^b>ʲxk^8X@SH?g1U[t6q83yeV0j,:8v:?_n~u$^)1J^mbJMBjH%bSMƈ.Q62P+Eqґ3[StF )xPR Aj+䳁*?VȎT 9%aH%U*Vʬ蜃w>D>+1'dGAq)+VU:(Ykd'x XB`P8Q$f&:/T 1elICYńp ڈN<oSR6N RiAGFb2vJRkcwR%`yWR!2+qz~gqu!5y;Dx݇vEtVwi#YkRBhyR0vsdoލ$.U'(ח&pP,9&s@vH|蠫g=]Q_xt.nII!A{[O所ľAL_ ;Iуx! 
oώ7ksV)wP7]r0%"gV\_M4:do Hxruý-Ww0\h;dF={ܿ9j4VR l؟v0{ukO'<4iؼy?V8'LJ8f5)yվ .:yXɮ2xwK6^UIt$[`Yjs`2Sh];r:_u>wF [^_O1}zxASuHkErӶruG(F1?7 'ibg2 mKZ(Œx^l>!o]W$;PwbTxmAW2>ޅuf gY/g?|eS\m3TJzvD vB)~oKJëʺT7eiu؞[gS*x&NQi?SXKOy{Ϸr]9!_ӆ.vHۂ{9~_}إ=gVvk%>7ɯ7%'riOSS1ԗVVNV[\OaFzcl7z] ~E't f}G3v=n<7 ڏ"?~-76/dZ}xO/!- zYa?nfg ȣ8L4``V&+olH m[tLqh~Mtbz;.1@N`rph*sJct"_JUD1%5@ːF7#9ID5IKQz7 %5HH#9yS;+i9I߃ŬwJLA; [}\P#4NܜIFS_zts7sY4b'5m־5FVRKn$='Z30aɓ3YG#4Rg% :6qӧu7(.J/7ۥy DuXPbmW -<$NBj~lCR#0Bʕ"QІ$<!fG3q'q$>RFK 0{}g'7o:gʶϵp Ix]Lֽ% 8I0f&#Ѝ{2Onu/%J(vzK RlLjv.j߉_(~fRqJ}r 6T=LRQj-R1Fq /J P;9T甭誎"U)A9D+[, &mwNsf$`C]q-ObNT d9I]5E2"3(v# QkH:p^ʺ4yEh8g`1_m,5Ӯ5|Y M}1s-fꘁJQmٝ>.jڦo~WiG{/n,| S*jkv[! 8mq ZJ`=[[r\ڤ+I&,1M^2ĉ` m܂901g=ev&țwe`3jdgFxuf`f`W0}_u؃/oj jqh]49<@WWWK zzOYbVN+:35}o:O~.ff;c^?SOb8H'o4kZِ׾6m\QۓMxs9m];?~q '{Phu~ T}=xׇ{` Q0i yntj?SOV@O+a8.ם0Km43 pϕ;W *O@4\M8Y5R/ݒmw9p# 7/lۇrd;Oz7; &gqzEz#ZJJ6jD==0-xXlLo_z xٰnȝ=L> y04tg^ &oΤ@JnB|:VK;N=^[=pd{ o,FT/]j%U;KOQ4y*s*Vg45Ɲ ۃ8CmlR%~q\Xk#G@_|K% mr!H6iONt3# _g,$tC#ŪbX_EGr81Fq-zpqz7'$GӾܣČ{ž4 -.c/FeD/13Ϝu(A8d0@z`cM0ۍh'ޚ]]/k/,D-W~rTI`t[+$yH>ގj;^h& u^.\ _&Wl q(k-pЂRex>>=.g.:u?Lm/N@]"<ܘKK鱙ܤ{Xٔ|EV4JU?|\H-o|[2' .N$~!sNֲIUp0"Pk>>+Axy.k0L"H 8tiE=Y2R<ԕe\TA dv07[G֏nQdupڽvbSO9nh%x3{ŝ(,ULU ae 蕶JOv#с`ĮG,>BMso RtSnjVUґW Tiz6۲3 r{a[`GX3l`>Cj̏e_9,Q2RpieAPQ5 ;ZuP{w< {MN9@wǓaq:I7_pnݺK8oG\B;xL.cQpb6"te޳5/KnMvVDnt Md$>b%0)\J ܎CI߆^N@ufZ 7gaSI˺Rg~וnm;vI4ыq|3͍ui/|*?zŜXWNHY)adEGT V ucρ}8 bIYa^ɶ(+HXpvEܒ:ǵ gje'Ey:̼WY3A| 9߽QӹNŎb.'< r VBf~]便N>4|Z>8'r fMJ*+X Ϣ )z=^o'r79 oc?y)@ BS"bYoȕQv^fKr2cIY\!$g)BNdZ$ y}nv1hL1cƸQ2b=$ɥ1[/C! (%הM4FP4#Mӝa֗r+sMvØI`һ֨'Wx}r'N}s,UT|n4|+jh 4xo՛DCǫTc9TX*[ϝPF 1nf3ph ͐>}ƉQ h A5Iݳ͐E8'BYijڟA:Bb 24GST.TD|-GH<9e %~-ܗeq;˖zKB?zF׆kbL!*u9OD@&bqډ>Du6ܴ̌ڡAH}IȄsQs%X7vfj_NPx4ܼ=q~^w:Qj:%1'`h_;?68Q@ 2&3F2KU{iq;Z?y@ZH8V|Ŀߔ&Dcg/(pY"l|@ oSb̤eJ[~ذNX~Tɉ|uI3{.*G}ezo΀LC \ˈ0:d͒V!Ox,K2HPr Fh5i:g &:*! l`l5<,ZV[)S6=Fu5Gd%R/ta4F-*NZ]̷À. 
I`5=xmb *UTZ:h~o ;L@Mn.hҒ%yh"Z\ 2h5&!@$lHBK3fʂ8 ߂.݇"Crb dDžxi2 |冭P;VPKaȭ_%oص4;LA zݧN,ݪvS঩[2unZwH 醴>b|}{f͕5ԧw#!~XB͇~=3LAL{2n0 57 3‚bcɵ|T^مj@."`!2x1:[x,X]\r&f %QU3i=mǐ =PckAӦ1qXeq?$[g%J: -Џ_ػij!֖beXqoSD/gk 9ͥKs?'cnS03]ގ[d6+H 6@ɦ͛DkVqR|0JuL6OXq[KV_?yyYf^Ͷ`=Qo9?ԃgк|,=yp?V8<|Jj @y..%g w_xǣOn1-o!-#@$!T~?}t)n'1>peP )5Wgv\R\'+2W1 ;S6'pnoFvpL M4!ѱNj{2L8tPXћ!'uX)'k0ܕ /|*n|*Gt(\wMx֭/JT]k4ں+nChWtʜǹara9X<(QctnQU9&tJIY::i!$ܒ"/':0䁃i'XfI5}UX׏+CLo7wz# Xʧ,Qvlhc(?&V|p8}_Sզ1Ɨzҧ߱ROZf=ikW;}ݦ,O1~˸0g{mM6zdz#_Y,oEV-'g1?B\J]893znd%v׍[}{/]ËդUCCrz͔cféUk1jUfxT6G+eN}Lyk#U֤,wԗOOLOO짰;9l_ǚ \6JJޣZtjIGq ~m ]gV(fec5J!:"XKe~8Ϥpty_nqTv% 7ُ.( V޹Toj>9J |{יXqx ciCVw7[] ؍P5A^x ./Ow?_?Ff'Ra%O@4lB WKuI1ZV+PIuO0\1 ]~hק"/t@i௝\6't vF2ǤDǙ\wJ#5HbmRQJ T0mѼ3 d)RK%s3Xt Ӿ&-2Y|h#v%0FH ZFSg]Ӡ&/\)!,k:xp]k-Dj"'=BB~}ړjO)Kt4T:~{;~׈EUh2UC(fn[ RB9tS;;+jDw;11K!#-Leh(Rp>\Lӽ^ x^ORtx4B,-J9QJ Ky^OR}d yv[h_; 7m}'ʹ9)QNZIQyb 3.ܪxB@5jt`=?1Ĝ|ţxW: 8ih:c7BK!v亓J6u@+QHm5oltg96<%dԢkX+%pN4!삇KMܰR3مejݴ.d!H6mbm} ТaBInGjӤzʡZyR̐i5[IB-[7 -F/Dši ')90KC2-yTQ H& շL#E2& vG]i0$b#K+wwuj3oTiE74ިOhWd45c2Í /X?N[ljNŨ/hXFȜJ| Ut+ϻq9>+s5q5˰pD+4@VM4NYKfb[kTP|TbJB jUZ%X|s/w>Ӯ\H* U4TPp%T9(uO)bBsqߩJ~=<3N1Άp%bSVNɯ+:81bmfE?--7mMTBRaʕ:)3THN )a;Se-dW^BB}:f?W3|س]pcCrfY4?EF7)ɤol>rD쥫yQo%y/ K8ނkߚTHZ͖<8LK^ ,vbdZeMcup] D- 1'k^qBmQ1 yDd @ n?'n(1"aL^D$"u)ʯ2O3cB2b9IiOׁ6 ºO.oT:iH0j/,d,OѱbճQٻ.,s% ŝ.(+w_vZW1CY')H|Pz{ʹ%q |D`2G .GMcmCBf|:Mk C& 2$TZJppzJ} \k:׎ nXw{yeVxfr?>P1r ̳lTYv\ u:M{aܟ3T)d֭k_%~&J$SIʊEsNK8@2fR˝)3)9<Ėjwc"Ky|\J V.lB]4u hyk\lӘv6!VLoOWT^ӏu!q>?H?tHro| bq ( rJ\P>=se,yd&^ Geif؀K!ёN"D$Vbۗ_)jh ()Q}y+;5B*o[QE u\̲( o΢w-xNL`S"^1 ?:RHRҷ1 skB hXKJ!k6k5{ۖn[&IAR&:>hbAf\D%w/b|^yяڻ˛Eg$8ۀciEq3އ6㚵 Ǭ#kIxSǕ>n]/w.}6avxz)}k!滇]+ˮZ^̦}8' }8#("]*$klWB#vS(l˅b`9xG ;JZvړ';p~ k:Cil؜qqsueE\ g$d(,ZKN"1)~zHLK@béHUYE{TM:XrgAiH4IJu%ψ@bJ#3NU1đ:x olޫg?Mn]秿%I,"O.I~\1p嚸zǸ<WfT E_XEBou?pui?l>!C>l~4l`wx^d:_ϕN/LhN\MJ$x,I%:i[zŜidf%FcjJ9FQPL>F}77%fpF}-FzGQ\N)/iz=}S$bsHצ席KB Yl)+,8F̠Y5VͼF[MfZ-@˰2\}pݺls{Z]rFM~RJkmL^2D[3LJ^Wiކ:W%^2w\7|;H]Q>dM[ SJ! 
z N@@3N(8 kda|2 I+fѦ/Wn06}edEp˪53F6V{Ι\ u -^t nf)(5?gQ|*%!ODsgu?H/5F==J0a|"E?(u9V"8oD)5u8߅k$aZ(xA%K?TR/N|$9_JT"a@to%ȺmM9y+89OBzf *Q1# -ݺ<1oѨqkT[2k~y<޹t7UA$!7yK++p-窅&zy֙;H RW$p[1pg uOKp'* *^jePvQ]~-@">_b~mh{3bOu4?aDA'd m!$= ZQ $ d DВҌS-εcCKr{;5rBpT?2,ڑif`J&/mF p,@kW\v9~^ZU^]-Zw&js_1eXM I g)n2s zyAohr< LYrmSR7>=#OvPbо.;H|#T?{GUT \w7׋Dc JoEBdjp&X0j@ J1ӏ^,*LkUQLi Ϋ\,H}fjuiύyH- rkW;󝭾N[vkE^2 gTwkx#3axDA ae^)(Lc׊/o1Z4Z*t|׊lћ~, v^)yCc;-Ex-]a'CR0A(. TgTa~8w{A[ľ?P'5[%x+H -۟>*ƴ#f?h?WȊ?DnLl'?1>/4Txo1Dς;j QTõM.LY~ź~]9 C;K烓!4}PsV@ kĦO9 /]\q}KG:.tP?i=LѸo'>+ TN j|w?mD•">Ҡ*QTy  U2?OE&}Lڙ3I?!Ho0yLcn7X2SL)5e U*D JQMYZu?©\6Fpo2{>lu< C"JEc_bI1b`)0 snřEOVk:d{׍$߮s]zawv8g ]nFs5/C #ٽ=y * j43g|; #Dgd ~텝XNlVU~͟ h0 /x}܁Ǥ[0C}ӵ?&.ױ]s2 "ls^KջL|sCP*!x$ת*`n֦ىj < C k ǺR#cBUj\@0w+Fm|Nf^yfʊlBG&>~q8d-_U񆃏՗Ѥ^ߠhQK<^'kY:زe2knbQʔցmlck/:9ȹiߦ|l wq/z6qq}Y'䨔092߳k:di}uXh=u󭃺O-DSZ4,wj?|=P?7~^%/m;@S~}wS%8%OZ4m(Ýb;pжCiC´.`ot'ZGqL#D$St]< Niݷ3w|7W%u2ZE5Ѕ:b{&v]zⲶz+%rd Zj;7=j Aw2*]=TO1Ns@=㍄Uk݁Zadg)FYĢ2H*F%ש&eQb0/aXdUvtN *tpϾ!b\c.{w[Q|d}YK}Tagd|tY)l~Q .'qsk$Sp=5E+W1ۏ?6ї9u!I۽[Y ljyV .YmL fc>v|ʾ}hYOlYdCG Ġ̮(+YqO;KEX$Mf|2^n7;y`J"O1= UqyڜPY' ߞ6^:QNE )wE&5`ΐPNf+J{uJàIvY\tX@TqO Kd:\E)T *dQkX$MxS pRg 5bHeV=.m>LZ2I Ӏ !hކfKKu  H"TgLG@) )_h(Y@+ci>P?RlvlD{K=Fad7{lBBD#IEMQ*D1Neʔq4TȄr;)6~ɐq&J7JqF/ nGJyic~) DB+Xęf) 2abZ q )ĂęDSǿG˷ԼA5TiJksێP"[cʮaщ;[{HU^t%z,O MMB^,b4R)pUZ,UaN3O!PW=*Je;Hc!#ؽԿ+2$Tc# }\>*̄ {B [?J`:mIt+*o&XX&A:*URb!шXG,""c QeKNo..x/M_:\rya@00abDb2ATD^y-Kb{5`&unЦlbՑj9RhJ=3deVO^q̭s mS![*,ۺEQڥ[W=p)Zz=}bz-^oW}^ 1z -͉Dy(so?8bU!Dqh r KW]%tO k&*Dz ڈsƧՎ)h H2rh2zT :@-DtRKRu Pq/]UeݻZc/ꐂ+ T6,ɧ~WYz~A*G纞xAԤHמB8iŠ9~&)t9%Vڬ+,]k݁}JaԘktLSQl)m4P;V'pwǍ.i;rB :CRQve_ rrdA ?Xvc6i91C8mN.PZBڸ 2c-J>}$Y# üM|^HxH2c|'ߓf{W 29"y1*onxu쫶UkW#D}ɹRX-m"0(Ԙbn,UGȉlŞG-! 
@2L-~BE֏Z}8mȕPg&:]_ch݀X2aFHa " @uJNe %$8A47"Ne I,4sw{5x?ORVF nCWyyC̆ܕ3yk^,oLѯ`ndNfDw;[F]щ;A>sO}Z-Ջukliq:!ĕBg\DgIa% Y8T [ &!9h{"ZE1$MH\ ԗX1*+S2̅VCı;ďX1Gî6bj QOb\1H!L0D@xo1vwT ![ hihX(|p㥝}&f@bDL;TfUC2CL:PTZf?J'.dQ;m<,gcWh&巇hkU%aA Ư8R ׹buűlG'v)x J"Ũ.OVs{UWǻw46# yp<6l'oqf7W(Ԟ6^\JJ p|1`GuA?T J?_UF24@)oKڰi_Oj UAh| R-L@X.Y6N~VsU=6{?p>P_;2sX"2òd05F2QqFc%R#)#I}SZ>u?ᰏ-OⶕIgp)S,P,(©1I*pbqBKtbeoLj'$ͬd[E|tw]]m{:|rpUwxFOrNB:H5mRL[%a \IRDj46a<Ӡm_J%kŀ'j"l((M1qUjw|VI 7#CXyYFBĜTIKɓ!cp bib$3Q"Iqu(/ti4bw!B w[ɥ?y'{FGrF|Oޭ1qkD>yECNb{R`><$[AJG襤]/E רCN]ȉ4zA l rk7x \^4kV^t1(jRb~km:}Xo7O疒!ֆXC&Dn} IQ)^(à̓jscf ρ]V63:}5~(Z, ͗][s8+*lW):3dg%\ ƚ-.L?)Y.H(Jlk4 jHp'^Lx6~Eθ@ݟƠExCDN$lM&(ZxRlwPudP^+ն?çC^RgϷ@c61;G8 Ǭ@1ٳgNb\(tG) nrTREsjilS87)/((# R}DE5BtwJH@ʥ\A%ڷ\|I2da`Sy9T%Ԝ.Z^k '>UTK..e*ܓxO5Dv+KTlK)l"۶ DvysZt;l}21~$osvͧ`|']S?\8,W,VO؏niy>\>UpkƘHZPMǘbn.*Pb$"DJ 1XvlB~f Ul:u96g mWJ VR/{VY Nqfm$\QjWp*\M&s5Tqٌc (/^bUh eKSX'H8>PXSj=}cjF@VBbKqJ%&F2HV"Pdu[9TZx]Q$|9|¾TTfk`H0hTɖvjįE!X}K"M˾L}+u&Hg/"焵7zןxwG꫱ þGr29UQQvCOCCKku1}B;ܽ{E} I^α|>^ b؇εAϞA^tSO+˿=/f1 "O#Ume)Gڔ@lj] ORFcZ 7E}*+tWJ5Qfo5l/ʂJ'KY˽|0w!•#kCyJ֌D,V^t=! ; .b'JM?+kRy`7wޤ Hᐺ5qWH>X=Z.'[.!$8RߚIjwg$,UT,cp+R[<.ý1+w# 'yLdvdls*˜f(3Ⱦsåpv-5m.?;.XT=H=-'/v,V Z.J5Om.M:Dm}LA#2 %gu~>~Tsʕ1VS %^ nC Dk^;eD!XEmD qL$֬|5Ww܄=ҥWIplz]fB:Ix@`EndI"!}ϛKO~ Q ?!QJ;W+YQ$u׽k,QERb p_%Hqp,h̹H{{"U !E4P*Q~ ĤLh vHע@"RnGl.[weYneuFuVҔ_(yz% !2}v[8P %?Cwm>D/ _Anasn@}Gb@QѴ?n8H'j·uv߱fɊ&Ej~s),`-GƳaH-*\0heF;҇[fv۪FbB% /Uqŵ"AЪwlZKۚ!qV9' ]vk<5ѦvEҟNfY`VAH i&%1HJnyqy|C j\Cin_fH\I K~}$ #0QPD L0YBuN\J[kgA)MIXIr+ҎF.FC [3dͲB.Qr:jaC"2jj Yh69W߯90jBu(ljQ0Dh;xD;%O2@(LNC ;+l>dto#m+ZDW﷛{ɺw{ɺ/n6#ċ,BrmuV( 2&VX4q&S*P*c T+c(f;~>f(܀?ow}{֜5E7Shyp˳fNiz0-f'I+X^fT'h(7sV|^?UTUUULQ4YhJ!R)TFIb FeD%~,@3د)#L+_QVͩR|S/m*0`RƖI0.aI@X eHLUAs+Djcd=eGb|fv٣BƛbvE;!򕨻&fRL& Dz(bv.Dx'{ϔڴ6f>j:SEg3e-:** AhN~P-e:X* x gHf(>rQĔP>XÔ1)Zji0Rc51H J"X\9X,?(o5 x`!į,RRB6*W9*8E'RYyR|w] MRL0 U/LJElUZh%F{ ULm(? 
,C 6<](7ATI}(7nzVkݯNc$ ck0L"EdOil 1ֱ:\;SR,U}^sF]r`玭9};yFdo ;WV}H,l4HvtVຢTD;ۯGu4poa3W %yX5}UPv'\@=#\,R\(WfC}j.Dj"52M)g(yjVZ+ǛWy3x泺F#Y Q: ;U RJYqN~ʾtX?T[Cae ,tT,:G#)h'G kԜ;u Kp\{]/hX1 c*c@D-c թ 1 j^/ :Sc8p`yFT8RA'QHK EiBb ̀X$% *5D/TT x!V.2Pa"BJ^ )b(jLe5!JjIN0w!̉#[YP".z^B"/+%EL\%QM'|]&Npa $Ppc.0hL0i! L8I,U6\:thh6dӡ-Kc?c;H E3-N'bwiﰒ0D;l| IO$1& 1B 6ł $9XTz4J@:!RT4ۂ\uA#N) vHz6B՚HI٢5zNju5@7t*a(p(#JC./vJMT*۞#Kفj"~6]~SM_VyU'zfAm1xDupN0jpH "\GZ +Dpam t~S9i7|T.1g-7j[OhfCW|;\+kc>+KK2dz4ɫ\Ϋ,OÛ,߿= >}Ǯ 2zCUolôpd a* ˕eخ~Y®z6ړ>-T>*풻! [+Oք|*Z)^FvY`,B k,LTk\mpS*QW&UHn}c̄ւ*E405 Ð+5SSS Ltg-'({]gŐr'O+0lXƏ*~m/O)B-ǟ<)Kғo?LWgn 2Z/`^׫߬4|X71D|_x#فd+knFE( mLa{\XMR=nGHIS(!nXD%L$ۋ^?Q?D*g|eCY]&Ke5o=-է|bXN&(JA9Ea(.ǭȝ#.o%"s94u,h409XPKE>S2Y&Ee|8!,-TsCj1kEaAUu|(y`(I,)D+KJʸEJzUm%,FXљ'ԡ.,6Ssަdd 7`ƄR̉X7eZ MAH>X7Y0%̖;F҉<:/7iff賔xYۏ?|?3q|S) S^;2DeϜ}[DW?r?^m ZtTETZZT,izhgkp:?VG PAefqfBʍT3P!hj))L:UQNjWQݚT_oQ  6˔RViYEN-=/?L@/(@g$ mӁNr]&?|׮$"|u lc8NP: 昧qsbbW$H(=[=<7|B70J*j.Z 6k2jÎl;ی0 ThQo7e0GPTǀ?mHh VԮt0Κ{f֜xk%gRd"-&!%Wn)S'e"ӽy$k-o#|,I>`ӕb˫5+.6+i2W3}߬" g&nJ0+}â= ޫ $Գo{w* @t^P@{(M,^kܬy[} &S`#FGa1݈Qp"g(J.64`DkiDQWLf秴q%*$!pj"ѥL IX?L>„i*@~&|BFDrGg[2R#xL'ҵ#K"ڹUF;"{hCuI^1!ve_^RoTf(& LBڕ)Pį[8m& i/$P;ӄiWbA)r5e4#'L,e\?gjw{2!tEXO~܌a&17ThK3'$C`Z{s{G RWx&f\>gW#έ$v^)pPeiR)shoV7 9ת[k=V;r ђVOEiV}{h`In.(  P,ЈP(m!=eR!O3 ʐ -=@ zj4Ns0ifpR $KRNDTH1r^jVupKyMgVyJ21,UNQRtjz`DgOBVE5Buֺ:OJQ')UTΓҊj.x1򜥔e*8S 2s^WT F(-:SJr:9.L%-2Sm-4z,da]农KjLYE(%3iLi!2UZz#V20mPþC,дbu.?Eӻ‡çB]ʚ?Gsd󎧣1jGYyoe5e9N/.ay*;_sҼW>Y٧.9]?药O1f g Mmчrʜ{/c.9'ۥ=t#{|e뮲ޗpCKZd̒}̒ӃcN2K!*5e:e=,w1CB*O gm//O ˫IS'Յ7[)<ͩ$G1Tw<w9hȨG!5sN#+Ji:*_̢,#BoYjh|(%KSiTPJYoGpٵer$?+&w۪IeLJ&{u)ޤZQiU8 pj(qNhzPtZïǝ"ж[PUd38eAJScL\pTV"겚 4NV GomzkCrmk&>%/p*֨.U`b! ްH\ 8L]ЎZJË+m4[ЫIuR|gwzp^C]@f]1Y!pR (2VpJe6‹y:n9x ]h#p6 ea#FSC6/UFCʁrO/OY߅;3m +&@1'}0bPe튋> YyǗ`Á>!# Ϭ65@N"D1_{ESS{ѩ5-Dm`_][7C v Btp9zB:c '3(6w !Wlxx - B>co!ݯ1j'hB@Ǡ~~("{1oҵBW? 1%tps t fxf3t0. 4̈́tGR;tN"dNR$zj伻4R. c's;pFJv7~Y!w"򡪦uVcWkݘ篮]}X^v,L65z%`G%8/%4,0**uRQB<czL,5-fLhnckϺ'ShBxZM I_j'NzM<0hx? G?ܥn! 
E3.*4>F˱;0Eo_ܽ (LRxyqkYu:8TTl{<5 >AtaXibƫ %%I2 f>|`%{w,C2O A&gcg[TG7:Kg,aρ9($3]j!<)|suCT_oR-V,6=7R@͖]}93WqP0+ n!Z/2dOB r\½U^込\#znjvd/I񍋺a&nhCV]M%N]~d:FJu}P #ea&ItYɊ=xL&Xt6c(.ѫU\=<,˘Fd0v40<5(LFl†S!k-N*j,AtQ  6%h͘LQG V33c~  -$^`׃͑?GHgGM*$89=(1L,UYYa=qvw^O)d`֠،[ 1(Ҽ`}z*I$n'1_B)YjBSDFbA3e$ϕ[pu%Q/9>a[B=q=DZ.o7>#mXLWZ1D74IgϽZ4SAm xy.qsx3FjWzo}<n/^u3NY6^IFrЂvsPeiձs98o@1^[9]ōg*/-UYؿub◭A>Niw"Xf,' juW4#W`Pl9 SQHikZ@;/(VSkh˫fZ^^}xp3ݛ3''8owB~M9O!3 d"J', 8E5E r%1U,3ŝ`J%JM1jL; :Dժ֯uԭs-跔R&󤴢Zp>^9o)(# OW TG9Rw&Jbzk\,:/-wכrTFw6^nlk7!wT $vo0S=hg4[= VKsvKp,n[.>q7ye( d:qBB qԻyU I 3Oѐ]_wg;(5Ǚh~N".1 Ddц}0PRhQW:5BZnjG:i-ռrI4#='A&zT6ڙ' &M7ڙ'iQ;jePNJQ=EfJƅ:$w(J$9XÌ ֳUZJkV 9TyZL e(W<#9i9"A*!&2rVL h͙ xH4 "2K'5U9ZxizL JbSfe(Ӆ0Rd7-k= (n:!@5a$េ:*4nQxM xKZI40փ^6<($OWw]]FTh/D׶񅟴hj"py&D*³/D͗ʼn c`f~D;SCE]YIFd!9KYo<3'+qMb=+3Uh]Lg6eJϣiJw((|e_~ç&18.%%f/iߦd{EzZe-)*#e,|3_>U^DwaxX,D>.g>c,|D[\\\F#EJ7xm<|}R\6f  Jb)+׀!8~? UU@;(90X0ň9HFzu$SUdqz/B1B-ˍE!>9rfݏn .ʌBƘ̤qϫ zl^ss %(#Ri2rFSS!EA RЋLT[:^<^)qc)Z~cYJv@1*SteFmtiF -iJnᑬ JbYc KnP1zGVc]Pd/`5.F4tJ/c5esmRIMLcf)>fJI9Zs[ <ilac`조)8wW~&yr ՜"^ +  )S9OLA2)+$JPSB RP#p^<=R_M͎A9i[ -ظb YHZ:4Qֿk|H>D4UPX@:0rjjI)$;#w>汐纀*Gtmv↙5 -n7h;wg1Kn^\DgS9wl"b0qiŲ0d]H*U7!́;"]YJvVnj1TY;P(٬Dom !˝ʻnST%mMĦAIgӎ c>ՂaO 'cnAЋ*jJ<5{dY 02%mrQ%K_VK:SQ 5Ez^,1s*E>=8zXpsjk^R>{g_)X1_j] DocamFutTR=}Q bMߤQ:aSPͩLkgSǪ=.w֤rtgĪ=>l!>սZ\?g޻ݬ/ܬ{ͮ+`Ƀy{?&7cNJX|S)BJ!.kI,[J10ץRQH)/wB?s) #G=@ܯcNdXI"Q 'ww\`,Pi$eV™PS PN)#Bj33B'Zcb#G+'7_Ě۫ͷ+p_fUr+sIħ.)ٺsIOJ.ڛPN}" !=h\*˗__C6_8,'3.}\ϩnVÌ{6ne~]-{fx~*FAD#tW-f|g1u0(~$~F)̮ w&+/5_YǛ{Sl\i9 qnۗCI{D@s>8r&/{ z84rMFKٔw܂.: {!@]3]DttC< abzFWB0P^%cZD%}1KGJ3*dZW!(@ղL7PO\wz1}Kde 2Bf u.Z4;|GXrQi9> 8tkN2B{q+SKh.u)@$'C݈t΀wGkk)s4f83;ܨKTvȥ>#:piN#ugZ\Ҭݾ8 AtJ}>xQ)B)[ Y3Ark@BB2FÀoL "͕2,7L3u@E '2uԥnSb&x6Sb ~ZV>|8Q}$z{e}gj"rtNήq˥ʓIPr=$',g$6u%A12Aq!Qp{Ax^H)G !6 ;.hV)qWt5fhIH׆Cz7 37H `E%ēAT*TɎ<;ɀuٲV=1OjmNae|XJ gk 5dQ]`6hW??/ATZ ٨F%mƳ17hޯj=]ɉγM!phγ/=8qiU v|1,pFlJ9RFÔJN*91MN)׮,pMr2VP!@،,%in3 r%, VJ %^ ċd*0Un~P0H3V4ЦLh*P`*#Zfro|>ϯDJ|YÿnyzbE;lv(O?޿JV0?9%zY–ʷu&’#qM~7bX6t7Ϯ?OSߞ\3#/ 
9M֓Ԕ䚻ӛ͆v;>X4%T2}}Hk`}C_/0V%A5GsN;UBYX&ɗ w>p/(X,붎I@uf|MA+X77$ObEv8(2q\d#ˏ?yT& ̉T2+LT&ϔ+F bjY,4O3 '|,QxKN_I]!؁N|̈́ au甁G37?1'_le53у7hwUR#;^EM Ǵ( L\I1%G@Em-Z  #EPJL95It x=Rtf%ww>p4-w8c xIkQ{2h-n ~9w$0<v9ֵ2K^cwT7>M5"t|UTjri`bpӛSaqp*.k2Zb⁈ R~-#~Gu9!iL"v>Dgo^%.H$Z"H5l]2`9ey4>JMx~nSqg/7'fV)rL'IPLT" `)wo., }yp;EO @VCހrj!y߀F-_7R&Gf##t08\l2YL>fE :4 &UImBm#BGZ7KJbP A5[ 0%%6^OBzq5f*.Ľ&ذB!R-@ *\kyAӜ$)JJ[n^ZmrEZ͙~z7ZAFhh%BdZ.:Z#9_Žɥ]DhE.jrFp {bIqUL:Kx YZЖ%`D fiMj5tVFh,dtJBTHԘ" 4I* ɚ-qJLMРH{yTQ1!%]rU-m-8hrȴMٻ6r$0w;k|?C.,.MFdI߯ݒZ/ݒ QbUUl%sav `؛AvlɬUt9m.*19? =x"pȏ봁D_Ott1r'T(1r>tn.ӁLJ=%J?z*tߞW2 aU^th\ao5ERsse0LcbCN&? Ef6?#f2-RψGl|xf+Jߣ}sh =&.܀;y|[V}`>IB捷h_oF=(#:|(1ݫ(WDTV_>'[бWpdspeIb8 J-vxP/Z/C*$VQ^#6n]ƕ:%+mB 8D/qH| BЧd#ԧ"G}$VgB#/LZN`x2zxYz&3MAd\OAL&/+ō7~%fЙqglc&B, ;2VuGq7& k,ͳ#͚g9˟A~y6 '*ZrA;)6R)E>G9y6:k=<O{BdCMž>y0WrOl6J쁄ɀ'k@AmQt2*Fk8a+@dtJmU|9^$p+#Ѷ:Yɻcͳ2d g O:M w ] cB^[7oM'T`4si.j= q^*{-:*ބGyל̆ۋ ]EśOZ7Q֥fhzsh;-̉S*XwUehC|=+ $^'wv_'0_L9lςgU {1H T%l4wk74c%d,] ﯠo''˛:x3yf/b{FBFi ~.2b0}^tZ3>dV?&x^`h3d sZ& d{Ag+.^6৹G9/_/&ݟb,d߀Y\@\ ~ʔ/W8~w+xx1M v> q5(fec$_?% m#BkfK08 l\i缄Z[⽌ѕ6E^/[?a.<KriGjzj]$?V2( m8 ˧@o:C7͂2 ~N7ko̬"YʈRhԕDB}Y6U;{5UT[Rf[ӊ7Bi]eI:tBg'FD[ qD[M1BJHM)TFQ5/_oTtIRLn~Tqr*M䵌@$]j.#Z ZyQ|E!4KU[_=cWģG4*j{MNyo8&{V-7D.UYu~X PqdsX.V\2? d&)fSOr?gu`u`u`uE vj/a88%ўGKBp(qZmcS+0r5'g~0׉[`iZQS߁@Ng0\n_ӬƗ*,5zqPUSJ! &a*%~VYb޲:FaBC4T{#ֈF!$h= Ulbd=YMR_6bVϫ8VUpl:Z(/C>`d uX86 L7g ^zzJA!D+rj@0P9˷ɼmpyގ/"Hw]vO痟_Ԛӂosw϶o-pH!. k:E30d7-!55V"GԌ}7а q sQ̶ Q ߛʅV(t7h1JaVL k#˙%. θ\ciCh'(JY Ü6k8ֆ" *dr%jyJ3Jù.XEY+tx4h7trNN?㩚ET߀zA8.$g?N6ht{?[k*6o}<&0]M2+pWt@{K\dpwG՗0Tg략=w/=a&91y ҭFN*(ap, b2 m9rhL6KxP彪ݰ~YjsNH}~RvNr"?$1Pi 4CP1V}έRijA?ռڸՊYN &Ոw>TD obH ca)"hot(!mi^sOHB`Tc: A,ndd!<"OFyaFU 4]RHsRg: t9o'7mҰSBwe2?ox?bT6@@6̳?{Wƍ K_.ݡƻ|Vm%l9`ITHvj34H 3C'Dpth3k Ipyf#įu#W̭Wr'zLJR{G`U>&ӳUEZE荺? W_AZn'> 1dH">MY0PӤj/]G3zpQNFv$qĔƶ&RQ*NVKAUT'@ j j9;#\SKZYL&ߝONM5y|Z  T 4_ã ƺ"[&6 4J@g@Hdc$ r5KЁ\E:ܕX{DvuMX{#Dc@Ʈxёp~شĸDyY)0+EN&o B Yxyj;nZz-s6v2ryw;p B鉭Dp\ٟj*9]C+=gR"@\=x1o6Hyބ|f? 
^mo -y/3f)Fo5tנn 77g3}I,TdռO dlF h(z^M拓r$b(I)VkʦP+-\ybJ=SM+5y6C7j&A)kǁky/w!iɺ%cp?}¡,|o82_\6tk(=6$K>d/(]CxjJ]L!q*ot9]P}?n gpKk"B+RPb;VI /JXyֶHVA5;(Bؙ7x=I;GNW-cX i.i$8`s1;_X繷I( A>|^]\.̲8[irAP{ی^Ʃe^hކ,ٽ8dettvK'q3hJvhSMh6o'da$.er o 7&jzVV`[Zbq#~#MXn0 s?ZQno6y8ՋVM{E#=Vax{`K~VI4 G/S(-q[ʵd{ɭO^?,Vϙ!X ?׵㟺uH!JL WUJSV8%%V\IU YV i$մ^t^˴Ȓ2=2%*,%h]>RໂZAg?s{ yFL۲ J@mJ۾BP_է5'8 <*Tdo%V%HP#(V%*TRj 8OJ(q,Hro[jj2t2< Gנ a]ID±V^#ԛJrD2# m~4vr3~,JV›\"\pP=Ȕ3D[ M)g\U}}*vSTy$<<gxC%?yx"F2#96TN&wU uF@JTb* *Wx؂6h 8.XР*#~Y*+~Y|f0G?=#YOjk[P LÆŸg6˦7ًm:f7پ/]zH,: y{ (pòX։kt2QBq?s09*?Ps:PO!ĄJV!i?(_V`|k__}gK?7ZW{1?1lHP7xm 2!Νe!4acQTsX⌃BABZC+V=3.+;c.vno=2&3uHPYr/l߾8bߑc/0/Hռ?KkRRcf9tu$Fr$8rٲ׭` zn"ᔡz*?gagʖN)k F8WyŊY<@V)%X ˒o׶=pߺϙw}T_mPByx;R饔*,NBJQ}I5g)}}RJ#SUi]R- +w=lFYR 0N <5)dU@5X4J&!Ha:{ V0dFNXowLħ%$NsSP,LᏋ@RRh$4 lbi'c^קE C4̙GxD4̙7xmYdI1 IdTi`.'Tf%A>|^ng479TVJ)`/,K IeQZ `%N0/J* 4D+ ulΊ̦@$0~6zu>x) y.+ipو?bƐ**W/y-vJJKv#s!Hyg"y;8;C@r"AdcD@'Qje?R8+uCXɪA?p<`! FjF/ Aũe!2Q`B*ԠK!NpE9C_Î D)Z*yv)t׼ ƤE3GS9Pj"`1"@ Aq6*Jl]AM J# 9_* [ A7MN]M9x tN#SJKšVyT Uj%dH k$Uԫs^y%LnR q'Q}I5`vvEd:HquRJdT3YJ_H/# #zRHT3.)OYJYdhkB&)x5fbykiB̗Y(JdWMVܩ{X? f!)m{qݘ৐FTC+*&98'aM$L_( qj[Dx@nx Mߛ` o7&ʫ[9\ dM@3RVKk*2И- #F^ݕ\,-[<`m) }[9Q?'[}ɥu>9G(֓xD@JbFa,ĶǦ:y\gfyc1V b )$PLB %VI:1p$a Kz^d0&`)5UkVQ#$2/@TJpȂ%5ub+؏IZVBbJYʒR:Jbxv(qe T!^7yg~^tpW6w} -,CGC{dE?맿zۆfϯMyw3[{ro^^ (w#7K6Pd}{?E7bU=;%?>m1@I? |W% r~*8XMU'h$E-ީjGbPWa_Zξ%H`X"js$0=.FzRO0z0ȇ>"{ B ΁Koge[ܮy+mrgR[GvSqZɶQm֌3% q׍8mFdd8QCNMrp[}Ht$ݔ 5;C#MP)O& 9(K)!] 
+dwwVæ'.irT2'SQ̗V} ۳{` />$X$B);9#,0YYP@tQaB!WOy{ى8-O‚2΂YBg U[yO x̘"RlbJ1sfJPJ;Q (~o?ϳn=TY>Ss%SV(z\vĉv <9g2 d$wGQu\~ Ivpzm95&HJt܃v<SyciùIBjעZ#]2\)R7ϔd@`!n!8iW}7g{(^M_)LY*Uh)"-HIT]遳Z +H8c n$ )kyKCIʛ_F?Z7g7?GdoZYn'ۍ$7W bf<}Ů `[\U/UyV*) F`\VZo ,,o eJmK- ѕ@Y3ÜU PɬB[*H)jZF`f}7~Tơx;41*6Rg{ 534s@ 7-88xiX2 +QRީK:y.g^RŤKɜP ƬZj '- Gj^bj_-ڀrE"gYPו#{*5AJe}R(BBk:rf^Ztţ7̱Aq[m9|:3nQOot'iȾ~x ;?6aRNnPȟN[ݴ }vdZ,ǻz[Rr ԟʹ <=1OYW[~ɻ.c5uecYaBG)oP^'@\1"H4`\sŊm@H!@ξ@y&>Ka !j.R|KƦ0_g]mF5)ꃝ->:.2wmaҎ-w-.7Px#B T͘"l #55 Ň4fN cgZŸ{ a3D$nЕW6@xyb LMm=|ҧgyUml7Z7 >):dw=Gnr)CqS^GI7!gn}q:}׈n3]r:?:P !8)-=FBAs(6Aq:ֻT/T !8x*BDʥ4t[ΰSm9Cp- KAJB9P2{=bLܺKOleRݫ޶լq1@Az`_=p-U/skkϑ>$. I- ِ%QDfhpm6]F*Phtye{nLH!|l}zx5|(tSa;>-$i&dyu>Ik5ؗl!Lvz k*KʺBQ}LVyp͈ 8zxۨ0BAmSrP7;\k.)K@s$i9<ˏ8ާUӱ,a%L"-Oڙodp,w؊U6ҨTlxWt ~2o}ڄ& R75\-pˏ?<[ nez [Ԍ^_m/8~jm+G$'ϟ?yɟ7;8g򌑭l^0LiȘ*3#d4>ZĩPI!Xat/~??TarWmI:о ,dغgmߕАcFRXg)Vwl}ʿ0>~֬N8B6'VϼmY΄#6S\)HP܀)Q HmfCyeY" "с 'AA9Z{~$RB^]Z$m |F|"@oY^R8o,4g<>A;8}W]E-8^nH"pH@5nlMnH?wKE?<<(gH#h#!dz==*sJ9=SL%dTu0AigABqoa$ِh9'Qlb,%9fGE/:FntoԹéT08ir ^H‿_֤ <@ԛ>Flp=dtr%[lB"ѷ۔;H?fw%m߽ԋ-b@h`;ӯ@U]Q}LVypR嗟lf &n,|L=̥*B M7eI^)J 2C$GxuT庌-P:oQWuᒩG?]iӉ6 I}vnRalve0HG:Lr  iC,T%%+|*V9Hy)%Ș0xEY0''Lk\㟫\̈́^> d)d~,TzitfwlVW*2)\'a YX3UX3AyB$ #qo7:~.g1ߊr_`Y|bwK?aZݑRkSh6c z8AZjEQTFZ2MzXb:?B2oA n_G!u6wcs9(Fd1x=8`Мd1V)H>p}<˭ˈ'JR"-EixY/ y2X\7uc= 7;tx ٗfK52B#v, /Bh& V` wi*sgF;eIJFmM >#I8[P@l|j ⭀L']AAn ӷVγ'i24E)tLr 9,\`kX]|\ $ QՎF|6Bh2'rPF,B96abs_fF1g Z̖ RΞB Qx ;oܗ.0 //[S$(rJt40sY$Gx-vlrbEpr_4F#A\iӉ6$W.a -Γ!"d RTټsEsdEiH547JͨĢDfEG:EFXTP2s7kV93 9TX-T\G⑌Sc2Y഼4(~0h e/q̷F0bӚ\ 394|y&,|\}*%=~x(-%ѐT@sЀQArö'̊K$7(,^ >vsQ LA\a 1L j`1&|+T1n sEo@wGDAAaqphn!r<(2MzX"r␃Y1-h$zܠ޻9gN÷B\ s0/wA@u, :Bn1U *s8gEjTH2g3[d;CAnS<:CcOG]j[ m48QWE!\,9ֻ]%dG۵)X5bN5BJDC?97^8 BoN"SNO,WSnsԄSm{>3qi^?ܼP27Z;?n]e.O~Wqƞ=CYJ=,QMlz{OzefH>jy z (n ҴhdHy0:dg83'ޭk .-`FRK"F86qvK-ީ%3l)c9rH'3z৵`~ IEeJ%ESwTYVp8ظ Y`sd>9ELGC̓u` yn70q} d1Bsm4r%6BH0rܳ =z Q!yׁfĔ 9G'=|rFU2SU?S<'MT9BkKXa2X wtexQ 8Ґ5!$ ool~:ƕ$Vj&}.UV,wm͍8;{ԼUVR3Mon}lw.#Yxt.DIVS,rgg V8fB7OF^و!NCB(>odzsӴ)>^?!O׏]bqL5ѹ5: 
e9Z:r+(p!QD1.\"HNPp"tD(v4:.@MZ>**yJ*h Ƞig2&bҽ2\tѐ">dz!Tcߖ|uMs.vyz[ϴ,%3M4ō0'ŕbe:\]-7 3*Ff^tӫDXCn&L'١(l[A|sf_ I>, Jwkw3DYd} b[w>F3 ?Y,% ^z_:ڷ'Wʶ]A׮=xVJN""ѹ.5l\nt.0L&)G"(U -A0۰AC#vv> K@xaI܄#|^k'1l\bhu-݅jF{_L͎|T?Ұm]cG  7U%v%һfQ WvR}߯.__]0<]6y>D۸y\xS-ȪQtSߛʼ8_\7ި'/V +$qx^(IKhϠS"|s29OINO3HŹ/UuHJ@$7 'hVuы?FjjzEoWX&ɆC' ؜8U8Vckf*h+fwg6{0Hk{j4T ~9p8Ԗ%$t RI f!*.}ǥE^7=l5R(d:z˯SQ!TjPi%tUo]򎝭h6K^O*H*Tģxd7*S:ěxC ֆ`͆'=b߄%, ԑhe tJkHC-G"`LӋ,1Wxh%[j,O@Χ&M:hbN+o5\PSz.Dܶn[goSV#\#,sx)TlхSZ h޽!3%ƻ?w5O/T"˿һs2zюG],m[CYQ{qz\Zm{Ogirv &BHMTШE %'5}F0>IdIAbb*.VFFV1/6דKFؘ,>BˉZ6B#]FL2|&le} JZApg"|s47I+=c*rб b~~ncLJjim(jwڎPds04x)WܟrC דed5CXT[_OJFi'eOav5*]`ϟ&NsUu}we`G#a?ow/ɒ(t[$;vlʌΆ$91#;fV6+55G Ry1Pra'*t=LݽvaSE!aHHVloOGp׃*0o?c@{FqsKH緹?QdTAo(Po11SN f:ʰ:AdmsLZ8(8fNY `!D!5a+yvWܳuu\X/IqJjA@` "-4 )u o5$Ԗ֣-dxxJTD{\NK{НTss 7N{均6)U1qt?ᬀ _魣M HXc=aJ$QZ*ù~~l_UR5Gfz?2ʆFdiyn&t)Y/$_dIhJb7 sZӚGk|tGˉ,XkP"Ԃ#* wSfL]Ca Vh kC6v7_~-#oAT3oֱ;֮$;!Xx'_XJ]k들΅Û~Z!3LLLLh2U4I8Ƙ|T 0 !e.|㌣ /kAMWѐuQ^XHWc!A(X )> $oBm$htV5\$$3ת44-I ]NHFUsA+$nLq\AƄ$6Ӿb<5Ir%[4Rk\P!M4#fL_z%<_i^Su/~øNjC+OM9t"!{<E`ltޗ[\, ʤH/OrXZPl%l*[01˥]2K;T"6]0H9tYA^=^YX ',Y}8T\ā"$hiGiə1z4T!wIr |pZpRC^>4;CN32j퀧O{~2ry"Eˋ\62HyOP0DVICXG;QJ,`$a 6Z2YFח-@9%¼/T%hfirfM7 \ʹ&0^9AU|R{ ΚEQH2NM;>.4oQVKwsE#+:ϥ1m,mXUalݿi%rOT7/zu_iIzSY? d `3K%W|잮/)^An\/yLâj%v*&|A Ý}~uR l !}tacBn?4 ?V>xěxL3[m!FKq͆'M~!Lxx &-\KS 2h ~҈Hz' ,&6(QLLX .uCƓ"^#Bsi"$kBrHj4) /P(ʉHV#ߝ'n Zoy-JJEDm4AI:STP{"i,jǤwdBUuBQFZ,k&ѢeZD) v6%`0 u=3D5PdVN5mn3Hnt"h}nJqJc[LwemHyk`(q)nS`z}giYr$YSؖl%@DgWU,Fm Kn_⼊%JIy]`paeNTSIY]`WeŋF9EV:VPTj'K,Zs잧\z(¾IZc 0Թ=كIҧ>P툼wr>MFH/|RhX~yuipqxx:t{G-W ;~>di qPB`hEDh;q˝u/x9UP@hmY_kH2䣕Եp闹Oc XWXpX尦gbeU\xǍG^$A`tkޭo7|pd쇛W 5On\L%U}\3D"}ӻ$™YX"׽0&?dWqӛ&˛/`-kPN=J/3Qٟ~m_] +*ўGq^/e~^议8+|87M2n=OreUR?1|Lf k`ψ^|bl KDQIޕh%+Zx^MZ, 2[N)nk]BpH]~'O痫*-U|Jٍ't>e9gC2/F[3\k} `nr wWOXLl?oYiE߬egDVT4zmhUFo!;Ik:% O [E|bx@9#Bw>y39̻S/*(Ts巺O}1$oiGB&~[ݚ`wF朞K?lqo2!} uoBZ|ɲ g[w$7Ǔ3m*@2sˏ9T%M<^IxL3~-!EvB,pyᚥ:Ŷ\=Rlm8Z:-\{u'~F?fHOjK#NӴ>8'_J]ZNb$BeGvW *o3\+/%zrVuKk њ'I j84v }V4|߽v$|w%? 
Feb 25 10:52:53 crc systemd[1]: Starting Kubernetes Kubelet... Feb 25 10:52:53 crc restorecon[4686]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 25 10:52:53 crc
restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 
crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 
10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:53 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:53 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c219,c404 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 
crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc 
restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to
system_u:object_r:container_file_t:s0 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 10:52:54 crc restorecon[4686]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 10:52:54 crc restorecon[4686]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 25 10:52:54 crc kubenswrapper[4725]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 10:52:54 crc kubenswrapper[4725]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 25 10:52:54 crc kubenswrapper[4725]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 10:52:54 crc kubenswrapper[4725]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 25 10:52:54 crc kubenswrapper[4725]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 25 10:52:54 crc kubenswrapper[4725]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.926095    4725 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935811    4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935877    4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935888    4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935898    4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935908    4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935916    4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935924    4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935933    4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935942    4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935950    4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935957    4725 feature_gate.go:330] unrecognized feature gate: Example
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935965    4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935977    4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935987    4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.935995    4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936003    4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936010    4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936018    4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936025    4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936033    4725 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936041    4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936048    4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936056    4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936063    4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936071    4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936082    4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936092    4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936103    4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936113    4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936121    4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936131    4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936142    4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936150    4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936158    4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936166    4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936174    4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936182    4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936190    4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936198    4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936207    4725 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936216    4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936224    4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936231    4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936239    4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936247    4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936255    4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936263    4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936271    4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936279    4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936287    4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936294    4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936303    4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936310    4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936321    4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936331    4725 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936341    4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936349    4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936360    4725 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936369    4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936379    4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936389    4725 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936397    4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936406    4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936414    4725 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936422    4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936432    4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936440    4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936447    4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936455    4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936462    4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.936470    4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937571    4725 flags.go:64] FLAG: --address="0.0.0.0"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937597    4725 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937618    4725 flags.go:64] FLAG: --anonymous-auth="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937630    4725 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937641    4725 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937651    4725 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937664    4725 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937678    4725 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937688    4725 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937697    4725 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937706    4725 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937716    4725 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937725    4725 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937734    4725 flags.go:64] FLAG: --cgroup-root=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937743    4725 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937752    4725 flags.go:64] FLAG: --client-ca-file=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937762    4725 flags.go:64] FLAG: --cloud-config=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937771    4725 flags.go:64] FLAG: --cloud-provider=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937779    4725 flags.go:64] FLAG: --cluster-dns="[]"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937791    4725 flags.go:64] FLAG: --cluster-domain=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937800    4725 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937810    4725 flags.go:64] FLAG: --config-dir=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937818    4725 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937852    4725 flags.go:64] FLAG: --container-log-max-files="5"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937864    4725 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937873    4725 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937883    4725 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937893    4725 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937902    4725 flags.go:64] FLAG: --contention-profiling="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937911    4725 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937919    4725 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937929    4725 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937937    4725 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937951    4725 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937960    4725 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937969    4725 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937978    4725 flags.go:64] FLAG: --enable-load-reader="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.937993    4725 flags.go:64] FLAG: --enable-server="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938002    4725 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938012    4725 flags.go:64] FLAG: --event-burst="100"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938021    4725 flags.go:64] FLAG: --event-qps="50"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938030    4725 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938039    4725 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938050    4725 flags.go:64] FLAG: --eviction-hard=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938061    4725 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938071    4725 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938080    4725 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938090    4725 flags.go:64] FLAG: --eviction-soft=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938102    4725 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938112    4725 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938121    4725 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938131    4725 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938141    4725 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938150    4725 flags.go:64] FLAG: --fail-swap-on="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938159    4725 flags.go:64] FLAG: --feature-gates=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938171    4725 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938180    4725 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938189    4725 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938240    4725 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938251    4725 flags.go:64] FLAG: --healthz-port="10248"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938261    4725 flags.go:64] FLAG: --help="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938270    4725 flags.go:64] FLAG: --hostname-override=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938279    4725 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938289    4725 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938298    4725 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938307    4725 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938316    4725 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938326    4725 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938335    4725 flags.go:64] FLAG: --image-service-endpoint=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938344    4725 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938353    4725 flags.go:64] FLAG: --kube-api-burst="100"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938362    4725 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938371    4725 flags.go:64] FLAG: --kube-api-qps="50"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938379    4725 flags.go:64] FLAG: --kube-reserved=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938388    4725 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938396    4725 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938406    4725 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938414    4725 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938424    4725 flags.go:64] FLAG: --lock-file=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938437    4725 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938447    4725 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938456    4725 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938471    4725 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938480    4725 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938489    4725 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938498    4725 flags.go:64] FLAG: --logging-format="text"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938507    4725 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938518    4725 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938527    4725 flags.go:64] FLAG: --manifest-url=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938536    4725 flags.go:64] FLAG: --manifest-url-header=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938548    4725 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938557    4725 flags.go:64] FLAG: --max-open-files="1000000"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938569    4725 flags.go:64] FLAG: --max-pods="110"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938579    4725 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938588    4725 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938598    4725 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938607    4725 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938617    4725 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938626    4725 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938636    4725 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938656    4725 flags.go:64] FLAG: --node-status-max-images="50"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938666    4725 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938675    4725 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938685    4725 flags.go:64] FLAG: --pod-cidr=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938693    4725 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938710    4725 flags.go:64] FLAG: --pod-manifest-path=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938719    4725 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938728    4725 flags.go:64] FLAG: --pods-per-core="0"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938737    4725 flags.go:64] FLAG: --port="10250"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938746    4725 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938755    4725 flags.go:64] FLAG: --provider-id=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938764    4725 flags.go:64] FLAG: --qos-reserved=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938774    4725 flags.go:64] FLAG: --read-only-port="10255"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938784    4725 flags.go:64] FLAG: --register-node="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938794    4725 flags.go:64] FLAG: --register-schedulable="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938804    4725 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938820    4725 flags.go:64] FLAG: --registry-burst="10"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938852    4725 flags.go:64] FLAG: --registry-qps="5"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938861    4725 flags.go:64] FLAG: --reserved-cpus=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938870    4725 flags.go:64] FLAG: --reserved-memory=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938882    4725 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938891    4725 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938901    4725 flags.go:64] FLAG: --rotate-certificates="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938910    4725 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938919    4725 flags.go:64] FLAG: --runonce="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938928    4725 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938936    4725 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938946    4725 flags.go:64] FLAG: --seccomp-default="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938955    4725 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938966    4725 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938976    4725 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938986    4725 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.938995    4725 flags.go:64] FLAG: --storage-driver-password="root"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939004    4725 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939013    4725 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939022    4725 flags.go:64] FLAG: --storage-driver-user="root"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939032    4725 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939041    4725 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939050    4725 flags.go:64] FLAG: --system-cgroups=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939059    4725 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939073    4725 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939083    4725 flags.go:64] FLAG: --tls-cert-file=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939091    4725 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939104    4725 flags.go:64] FLAG: --tls-min-version=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939113    4725 flags.go:64] FLAG: --tls-private-key-file=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939121    4725 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939131    4725 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939140    4725 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939151    4725 flags.go:64] FLAG: --v="2"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939162    4725 flags.go:64] FLAG: --version="false"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939175    4725 flags.go:64] FLAG: --vmodule=""
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939187    4725 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.939197    4725 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939419    4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939431    4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939439    4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939448    4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939458    4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939466    4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939474    4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939482    4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939494    4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939505 4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939514 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939523 4725 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939531 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939539 4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939548 4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939557 4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939565 4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939573 4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939580 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939589 4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939597 4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939604 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939612 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939625 4725 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939633 4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939641 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939648 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939656 4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939667 4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939676 4725 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939685 4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939693 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939701 4725 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939708 4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939730 4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939739 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939747 4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939754 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939762 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939769 4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939786 4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939793 4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939801 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939808 4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939816 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939823 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939873 4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939882 4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939890 4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939897 4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939905 4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939913 4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939922 4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939930 4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939937 4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939949 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939957 4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939965 4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939973 4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939984 4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.939994 4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940004 4725 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940013 4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940022 4725 feature_gate.go:330] unrecognized feature gate: Example
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940031 4725 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940041 4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940052 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940061 4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940069 4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940077 4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.940086 4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.940112 4725 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.954532 4725 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.954815 4725 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.954973 4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.954990 4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955002 4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955011 4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955020 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955029 4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955036 4725 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955044 4725 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955053 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955060 4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955068 4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955076 4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955084 4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955093 4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955100 4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955108 4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955115 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955123 4725 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955131 4725 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955138 4725 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955146 4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955154 4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955162 4725 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955172 4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955181 4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955190 4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955199 4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955208 4725 feature_gate.go:330] unrecognized feature gate: Example
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955216 4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955225 4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955234 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955242 4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955249 4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955257 4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955290 4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955298 4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955306 4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955313 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955321 4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955329 4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955336 4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955344 4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955352 4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955360 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955368 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955375 4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955383 4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955391 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955398 4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955407 4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955417 4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955427 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955438 4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955450 4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955460 4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955472 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955481 4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955489 4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955497 4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955504 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955512 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955520 4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955528 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955536 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955543 4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955551 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955558 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955566 4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955573 4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955581 4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955591 4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.955605 4725 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955888 4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955902 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955912 4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955921 4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955929 4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955937 4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955944 4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955953 4725 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955962 4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955969 4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955977 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955985 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.955993 4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956001 4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956008 4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956016 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956025 4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956033 4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956040 4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956048 4725 feature_gate.go:330] unrecognized feature gate: Example
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956056 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956064 4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956072 4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956079 4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956087 4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956095 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956102 4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956113 4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956124 4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956133 4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956141 4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956150 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956158 4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956167 4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956177 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956186 4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956194 4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956202 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956210 4725 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956219 4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956228 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956236 4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956244 4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956252 4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956261 4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956269 4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956279 4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956289 4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956299 4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956309 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956317 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956326 4725 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956334 4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956344 4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956352 4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956361 4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956369 4725 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956377 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956385 4725 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956393 4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956401 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956408 4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956417 4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956425 4725 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956433 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956442 4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956450 4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956459 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956469 4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956479 4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 25 10:52:54 crc kubenswrapper[4725]: W0225 10:52:54.956489 4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.956502 4725 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.957797 4725 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 25 10:52:54 crc kubenswrapper[4725]: E0225 10:52:54.962556 4725 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.967131 4725 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.967281 4725 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.968965 4725 server.go:997] "Starting client certificate rotation"
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.969015 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.969296 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 25 10:52:54 crc kubenswrapper[4725]: I0225 10:52:54.998317 4725 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.001960 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.002029 4725 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.028905 4725 log.go:25] "Validated CRI v1 runtime API"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.073095 4725 log.go:25] "Validated CRI v1 image API"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.076275 4725 server.go:1437] "Using cgroup driver setting received from the CRI runtime"
cgroupDriver="systemd" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.082628 4725 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-25-10-48-49-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.082689 4725 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:46 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.113890 4725 manager.go:217] Machine: {Timestamp:2026-02-25 10:52:55.110158436 +0000 UTC m=+0.608740531 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:aee608f3-29ba-451f-a6f1-6eeae4d0f001 BootID:a6d2d14d-afd1-48db-8d7e-cf300f526a2d Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:46 Capacity:3365408768 Type:vfs 
Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ae:ad:3e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ae:ad:3e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ca:44:87 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3a:5a:2c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:fc:7f:2b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3b:6e:50 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:42:60:1b:c7:8b:d6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:e2:2c:40:7c:01 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 
BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 25 10:52:55 crc 
kubenswrapper[4725]: I0225 10:52:55.114559 4725 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.114929 4725 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.116327 4725 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.116928 4725 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.117024 4725 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePe
riod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.117560 4725 topology_manager.go:138] "Creating topology manager with none policy" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.117589 4725 container_manager_linux.go:303] "Creating device plugin manager" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.118186 4725 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.118266 4725 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.119243 4725 state_mem.go:36] "Initialized new in-memory state store" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.119434 4725 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.123167 4725 kubelet.go:418] "Attempting to sync node with API server" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.123225 4725 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.123321 4725 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.123357 4725 kubelet.go:324] "Adding apiserver pod source" Feb 25 10:52:55 crc 
kubenswrapper[4725]: I0225 10:52:55.123386 4725 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.130194 4725 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.130803 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.130861 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.131015 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.131031 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.131631 4725 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.133654 4725 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137423 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137498 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137523 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137542 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137573 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137592 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137610 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137642 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137665 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137685 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137726 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.137743 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.138925 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.140211 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.141567 4725 server.go:1280] "Started kubelet"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.143002 4725 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.143184 4725 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.144164 4725 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 25 10:52:55 crc systemd[1]: Started Kubernetes Kubelet.
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.150538 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.150606 4725 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.151910 4725 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.151954 4725 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.152013 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.152215 4725 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.152216 4725 server.go:460] "Adding debug handlers to kubelet server"
Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.153314 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.153409 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.153862 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.153585 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189777de5257b8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,LastTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.156780 4725 factory.go:55] Registering systemd factory
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.156813 4725 factory.go:221] Registration of the systemd container factory successfully
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.161554 4725 factory.go:153] Registering CRI-O factory
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.161579 4725 factory.go:221] Registration of the crio container factory successfully
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.161673 4725 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.161722 4725 factory.go:103] Registering Raw factory
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.161746 4725 manager.go:1196] Started watching for new ooms in manager
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.162525 4725 manager.go:319] Starting recovery of all containers
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169567 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169681 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169705 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169726 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169744 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169763 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169783 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169803 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169859 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169888 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169907 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169928 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169946 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169968 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.169988 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170006 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170024 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170044 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170069 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170104 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170123 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170200 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170224 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170243 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170260 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170280 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170370 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170390 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170408 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170428 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170445 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170464 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170481 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170498 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170516 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170533 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170551 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170569 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170589 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170606 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170623 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170640 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170659 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170676 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170704 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170722 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170738 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170757 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170776 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170803 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170849 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls"
seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170867 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170893 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170915 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170935 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170958 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.170978 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 25 10:52:55 crc 
kubenswrapper[4725]: I0225 10:52:55.171012 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171066 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171092 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171118 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171145 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171167 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171196 4725 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171222 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171246 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171272 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171295 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171319 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171345 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171372 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171428 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171451 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171471 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171494 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171525 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171544 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171564 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171582 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171604 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171624 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171644 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171663 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171683 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171702 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171721 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171751 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171770 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" 
seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171792 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171810 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171856 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171886 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171916 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.171952 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 
10:52:55.171979 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172001 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172021 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172042 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172061 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172079 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172100 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172120 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172140 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172159 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.172640 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.173924 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174030 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174106 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174144 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174210 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174253 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174312 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174365 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" 
seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174403 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174453 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174489 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174542 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174577 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174609 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 
10:52:55.174658 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174692 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174745 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174776 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174811 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174913 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174948 4725 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.174994 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178434 4725 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178538 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178576 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178664 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 
10:52:55.178697 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178733 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178761 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178812 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178881 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178910 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178931 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178957 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.178980 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179010 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179042 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179066 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179103 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179133 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179166 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179201 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179232 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179272 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179304 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179337 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179369 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179400 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179433 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179461 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179497 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179521 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179547 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179580 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179602 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179632 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179667 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179690 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179722 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179746 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179769 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179800 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179900 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.179983 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180007 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180029 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180058 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180083 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180111 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180132 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180156 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180183 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180205 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180235 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180257 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180278 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180308 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180353 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180415 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180469 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180542 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180572 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180593 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180620 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180650 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180674 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180706 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180730 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180753 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180784 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180806 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180868 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180891 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180910 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180939 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180964 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.180993 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181061 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181082 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181145 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181170 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181823 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181901 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181923 4725 reconstruct.go:97] "Volume reconstruction finished"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.181938 4725 reconciler.go:26] "Reconciler: start to sync state"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.196733 4725 manager.go:324] Recovery completed
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.207124 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.209016 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.209050 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.209058 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.209916 4725 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.209934 4725 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.209963 4725 state_mem.go:36] "Initialized new in-memory state store"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.219159 4725 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.222852 4725 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.222943 4725 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.222983 4725 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.223050 4725 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.226010 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.226167 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.236111 4725 policy_none.go:49] "None policy: Start"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.237222 4725 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.237291 4725 state_mem.go:35] "Initializing new in-memory state store"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.252857 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.294973 4725 manager.go:334] "Starting Device Plugin manager"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.295318 4725 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.295343 4725 server.go:79] "Starting device plugin registration server"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.295844 4725 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.295879 4725 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.296117 4725 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.296245 4725 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.296257 4725 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.308207 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.323154 4725 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.323248 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.324559 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.324625 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.324646 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.324986 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.325084 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.325126 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.325973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.325996 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326033 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326049 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326059 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326173 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326390 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326410 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326936 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326956 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.326963 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327028 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327329 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327354 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327437 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327473 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327497 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327740 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327770 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327876 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.327981 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328001 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328012 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328077 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328123 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328559 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328570 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328703 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.328736 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.329044 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.329084 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.329101 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.330434 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.330463 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.330474 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.354698 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384346 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384390 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384435 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384458 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384498 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384519 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384537 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384577 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384598 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384618 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384658 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384681 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384702 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384744 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.384765 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.396936 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.398242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.398283 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.398295 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.398319 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.398951 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.485693 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.485779 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.485876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.485912 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.485950 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.485985 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486014 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486024 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486076 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486114 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486044 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486191 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486186 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") 
pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486203 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486229 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486254 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486279 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486292 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486296 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486334 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486352 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486393 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486426 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486256 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486458 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486337 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486493 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486664 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486723 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.486818 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.599677 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.601776 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.601873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.601901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.601970 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.602582 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.650393 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.660024 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.686755 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.702460 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: I0225 10:52:55.708440 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.711407 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4a0596ebd48a8009699891419308548bebeaf5bbae1c8d278ad088b8c3af249a WatchSource:0}: Error finding container 4a0596ebd48a8009699891419308548bebeaf5bbae1c8d278ad088b8c3af249a: Status 404 returned error can't find the container with id 4a0596ebd48a8009699891419308548bebeaf5bbae1c8d278ad088b8c3af249a Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.734163 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b3d2ca930119e7135cf5de811fdd1475acfe707796cb2f4ccfa94874d07d2939 WatchSource:0}: Error finding container b3d2ca930119e7135cf5de811fdd1475acfe707796cb2f4ccfa94874d07d2939: Status 404 returned error can't find the container with id b3d2ca930119e7135cf5de811fdd1475acfe707796cb2f4ccfa94874d07d2939 Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.737951 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-72e8eb4cf96cd30090203cf690f6213a444317be3b0ec1a124e64a412526c4ae WatchSource:0}: Error finding container 72e8eb4cf96cd30090203cf690f6213a444317be3b0ec1a124e64a412526c4ae: Status 404 returned 
error can't find the container with id 72e8eb4cf96cd30090203cf690f6213a444317be3b0ec1a124e64a412526c4ae Feb 25 10:52:55 crc kubenswrapper[4725]: W0225 10:52:55.743810 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e420e29bc3d1582be8cf805aacc55053c8c453450f96460b6bb4540c3007c9a4 WatchSource:0}: Error finding container e420e29bc3d1582be8cf805aacc55053c8c453450f96460b6bb4540c3007c9a4: Status 404 returned error can't find the container with id e420e29bc3d1582be8cf805aacc55053c8c453450f96460b6bb4540c3007c9a4 Feb 25 10:52:55 crc kubenswrapper[4725]: E0225 10:52:55.756302 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.003027 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.005392 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.005469 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.005494 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.005548 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.006367 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.196:6443: connect: connection refused" node="crc" Feb 25 10:52:56 crc kubenswrapper[4725]: W0225 10:52:56.138023 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.138134 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.141259 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.229494 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"60304888efbdddc8541eb7da22594311ad77780ab55e7a10061d1c9edbdc7cb7"} Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.232286 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a0596ebd48a8009699891419308548bebeaf5bbae1c8d278ad088b8c3af249a"} Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.235796 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e420e29bc3d1582be8cf805aacc55053c8c453450f96460b6bb4540c3007c9a4"} Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.237397 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72e8eb4cf96cd30090203cf690f6213a444317be3b0ec1a124e64a412526c4ae"} Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.239054 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b3d2ca930119e7135cf5de811fdd1475acfe707796cb2f4ccfa94874d07d2939"} Feb 25 10:52:56 crc kubenswrapper[4725]: W0225 10:52:56.414897 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.415018 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:56 crc kubenswrapper[4725]: W0225 10:52:56.493500 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.493644 4725 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.558555 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s" Feb 25 10:52:56 crc kubenswrapper[4725]: W0225 10:52:56.699227 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.699373 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.806475 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.809407 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.809473 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.809489 4725 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 25 10:52:56 crc kubenswrapper[4725]: I0225 10:52:56.809530 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.810180 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Feb 25 10:52:56 crc kubenswrapper[4725]: E0225 10:52:56.879492 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189777de5257b8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,LastTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.122242 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 10:52:57 crc kubenswrapper[4725]: E0225 10:52:57.123926 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.141469 4725 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.245087 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.244931 4725 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3" exitCode=0 Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.245208 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.247185 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.247238 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.247253 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.250639 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.250707 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.250726 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.250740 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.250745 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.251893 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.251924 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.251936 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.253423 4725 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285" exitCode=0 Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.253470 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.253548 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.254768 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.254808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.254851 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.256956 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e" exitCode=0 Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.257065 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.257176 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.258229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.258390 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.258424 4725 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.259265 4725 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af" exitCode=0 Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.259336 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af"} Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.259521 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.260645 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.260710 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.260758 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.260780 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.261256 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.261292 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:57 crc kubenswrapper[4725]: I0225 10:52:57.261304 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:58 
crc kubenswrapper[4725]: W0225 10:52:58.136768 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:58 crc kubenswrapper[4725]: E0225 10:52:58.136923 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.141727 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:58 crc kubenswrapper[4725]: E0225 10:52:58.164169 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="3.2s" Feb 25 10:52:58 crc kubenswrapper[4725]: W0225 10:52:58.236129 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Feb 25 10:52:58 crc kubenswrapper[4725]: E0225 10:52:58.236233 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.266043 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.266098 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.266110 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.266225 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.267119 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.267144 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.267152 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.276917 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e6d547ace32606741d46b49e937937a0a9f8ac8c40f448e673445cd2c2f81725"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.276971 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.276985 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.276997 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.277010 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.283096 4725 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8" exitCode=0 Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.283207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.283269 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.284591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.284645 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.284658 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.285510 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.285520 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.286035 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4"} Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.286372 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.286459 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.286539 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.286771 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.286795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.286863 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.410497 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.421050 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.421126 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.421142 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:58 crc kubenswrapper[4725]: I0225 10:52:58.421181 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:52:58 crc kubenswrapper[4725]: E0225 10:52:58.421941 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.292256 4725 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953" exitCode=0 Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.293037 4725 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.292361 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953"} Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.293919 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.292530 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.294121 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.294281 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.294556 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.294687 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.294795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.295208 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.295238 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.295252 4725 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.295552 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.295646 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.295672 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.296198 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.296251 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:52:59 crc kubenswrapper[4725]: I0225 10:52:59.296275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.299734 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c"} Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.299853 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb"} Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.299876 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a"} 
Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.299891 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.299901 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.300068 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.301078 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.301124 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.301141 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.302155 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.302194 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.302217 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.552412 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.552686 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.554254 4725 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.554311 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.554323 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:00 crc kubenswrapper[4725]: I0225 10:53:00.991936 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.309229 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e"} Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.309964 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e"} Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.309358 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.309355 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.310091 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.311510 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.311539 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 
10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.311567 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.311568 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.311634 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.311593 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.403873 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.622931 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.624291 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.624351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.624362 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:01 crc kubenswrapper[4725]: I0225 10:53:01.624396 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.015528 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.312407 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 
10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.314327 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.314415 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.314439 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.465314 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.465628 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.465699 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.467891 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.467993 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.468031 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:02 crc kubenswrapper[4725]: I0225 10:53:02.842288 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.315603 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.315603 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.317455 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.317532 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.317557 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.318322 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.318384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:03 crc kubenswrapper[4725]: I0225 10:53:03.318409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:04 crc kubenswrapper[4725]: I0225 10:53:04.860319 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:04 crc kubenswrapper[4725]: I0225 10:53:04.860667 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:04 crc kubenswrapper[4725]: I0225 10:53:04.862507 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:04 crc kubenswrapper[4725]: I0225 10:53:04.862566 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:04 crc kubenswrapper[4725]: I0225 10:53:04.862584 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:05 crc kubenswrapper[4725]: E0225 
10:53:05.308371 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 10:53:05 crc kubenswrapper[4725]: I0225 10:53:05.516625 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:05 crc kubenswrapper[4725]: I0225 10:53:05.516949 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:05 crc kubenswrapper[4725]: I0225 10:53:05.518676 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:05 crc kubenswrapper[4725]: I0225 10:53:05.518729 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:05 crc kubenswrapper[4725]: I0225 10:53:05.518751 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.197882 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.198208 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.200141 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.200196 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.200215 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.533758 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.534119 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.536790 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.536935 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.536964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:06 crc kubenswrapper[4725]: I0225 10:53:06.541099 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:07 crc kubenswrapper[4725]: I0225 10:53:07.328817 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:07 crc kubenswrapper[4725]: I0225 10:53:07.331151 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:07 crc kubenswrapper[4725]: I0225 10:53:07.331220 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:07 crc kubenswrapper[4725]: I0225 10:53:07.331241 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:07 crc kubenswrapper[4725]: I0225 10:53:07.334680 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:07 crc kubenswrapper[4725]: I0225 10:53:07.861059 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 10:53:07 crc kubenswrapper[4725]: I0225 10:53:07.861198 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 10:53:08 crc kubenswrapper[4725]: I0225 10:53:08.331333 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:08 crc kubenswrapper[4725]: I0225 10:53:08.332433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:08 crc kubenswrapper[4725]: I0225 10:53:08.332490 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:08 crc kubenswrapper[4725]: I0225 10:53:08.332506 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.114784 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 25 10:53:09 crc kubenswrapper[4725]: W0225 10:53:09.117749 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z Feb 25 10:53:09 crc kubenswrapper[4725]: W0225 10:53:09.117852 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.117994 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.117919 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:09 crc kubenswrapper[4725]: W0225 10:53:09.120532 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.120629 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:09 crc kubenswrapper[4725]: W0225 10:53:09.121502 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.121599 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.122146 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189777de5257b8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,LastTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.123998 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.125520 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" node="crc" Feb 25 10:53:09 crc kubenswrapper[4725]: E0225 10:53:09.126077 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.126357 4725 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe 
status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.126415 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.130772 4725 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.130875 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.144927 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:09Z is after 2026-02-23T05:33:13Z Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.335997 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 
10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.337753 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e6d547ace32606741d46b49e937937a0a9f8ac8c40f448e673445cd2c2f81725" exitCode=255 Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.337808 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e6d547ace32606741d46b49e937937a0a9f8ac8c40f448e673445cd2c2f81725"} Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.338031 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.339026 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.339061 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.339071 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.339793 4725 scope.go:117] "RemoveContainer" containerID="e6d547ace32606741d46b49e937937a0a9f8ac8c40f448e673445cd2c2f81725" Feb 25 10:53:09 crc kubenswrapper[4725]: I0225 10:53:09.810553 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.145143 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:10Z is after 
2026-02-23T05:33:13Z Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.349092 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.349592 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.351718 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" exitCode=255 Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.351773 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db"} Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.351906 4725 scope.go:117] "RemoveContainer" containerID="e6d547ace32606741d46b49e937937a0a9f8ac8c40f448e673445cd2c2f81725" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.351942 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.353110 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.353162 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.353177 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:10 crc kubenswrapper[4725]: I0225 10:53:10.353799 4725 
scope.go:117] "RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:10 crc kubenswrapper[4725]: E0225 10:53:10.354331 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.001328 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.145982 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:11Z is after 2026-02-23T05:33:13Z Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.357213 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.360217 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.361275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.361306 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.361314 4725 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.361969 4725 scope.go:117] "RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:11 crc kubenswrapper[4725]: E0225 10:53:11.362154 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:11 crc kubenswrapper[4725]: I0225 10:53:11.366452 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.053086 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.053394 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.055262 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.055331 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.055388 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.075326 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 
10:53:12.146535 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:12Z is after 2026-02-23T05:33:13Z Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.363548 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.364471 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.365237 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.365305 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.365330 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.366003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.366226 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.366392 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.367652 4725 scope.go:117] "RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:12 crc kubenswrapper[4725]: E0225 10:53:12.368187 4725 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:12 crc kubenswrapper[4725]: I0225 10:53:12.842672 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:13 crc kubenswrapper[4725]: I0225 10:53:13.144115 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:13Z is after 2026-02-23T05:33:13Z Feb 25 10:53:13 crc kubenswrapper[4725]: W0225 10:53:13.294613 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:13Z is after 2026-02-23T05:33:13Z Feb 25 10:53:13 crc kubenswrapper[4725]: E0225 10:53:13.294715 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:13 crc kubenswrapper[4725]: I0225 10:53:13.366526 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 25 10:53:13 crc kubenswrapper[4725]: I0225 10:53:13.367997 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:13 crc kubenswrapper[4725]: I0225 10:53:13.368125 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:13 crc kubenswrapper[4725]: I0225 10:53:13.368147 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:13 crc kubenswrapper[4725]: I0225 10:53:13.369305 4725 scope.go:117] "RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:13 crc kubenswrapper[4725]: E0225 10:53:13.369555 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:13 crc kubenswrapper[4725]: W0225 10:53:13.599135 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:13Z is after 2026-02-23T05:33:13Z Feb 25 10:53:13 crc kubenswrapper[4725]: E0225 10:53:13.599242 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-25T10:53:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:14 crc kubenswrapper[4725]: I0225 10:53:14.146763 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:14Z is after 2026-02-23T05:33:13Z Feb 25 10:53:14 crc kubenswrapper[4725]: I0225 10:53:14.369962 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:14 crc kubenswrapper[4725]: I0225 10:53:14.371244 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:14 crc kubenswrapper[4725]: I0225 10:53:14.371426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:14 crc kubenswrapper[4725]: I0225 10:53:14.371555 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:14 crc kubenswrapper[4725]: I0225 10:53:14.372927 4725 scope.go:117] "RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:14 crc kubenswrapper[4725]: E0225 10:53:14.373415 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:15 crc kubenswrapper[4725]: I0225 10:53:15.144064 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:15Z is after 2026-02-23T05:33:13Z Feb 25 10:53:15 crc kubenswrapper[4725]: E0225 10:53:15.308581 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 10:53:15 crc kubenswrapper[4725]: E0225 10:53:15.520656 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:15Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 25 10:53:15 crc kubenswrapper[4725]: I0225 10:53:15.525794 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:15 crc kubenswrapper[4725]: I0225 10:53:15.527729 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:15 crc kubenswrapper[4725]: I0225 10:53:15.527914 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:15 crc kubenswrapper[4725]: I0225 10:53:15.527992 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:15 crc kubenswrapper[4725]: I0225 10:53:15.528077 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:15 crc kubenswrapper[4725]: E0225 10:53:15.532817 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:15Z is after 
2026-02-23T05:33:13Z" node="crc" Feb 25 10:53:16 crc kubenswrapper[4725]: I0225 10:53:16.144107 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:16Z is after 2026-02-23T05:33:13Z Feb 25 10:53:17 crc kubenswrapper[4725]: W0225 10:53:17.134211 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:17Z is after 2026-02-23T05:33:13Z Feb 25 10:53:17 crc kubenswrapper[4725]: E0225 10:53:17.134325 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:17 crc kubenswrapper[4725]: I0225 10:53:17.146672 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:17Z is after 2026-02-23T05:33:13Z Feb 25 10:53:17 crc kubenswrapper[4725]: I0225 10:53:17.861259 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 10:53:17 crc kubenswrapper[4725]: I0225 10:53:17.861373 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 10:53:17 crc kubenswrapper[4725]: I0225 10:53:17.867516 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 10:53:17 crc kubenswrapper[4725]: E0225 10:53:17.874018 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:18 crc kubenswrapper[4725]: I0225 10:53:18.152229 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:18Z is after 2026-02-23T05:33:13Z Feb 25 10:53:19 crc kubenswrapper[4725]: E0225 10:53:19.127490 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:53:19Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189777de5257b8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,LastTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:19 crc kubenswrapper[4725]: I0225 10:53:19.146542 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:19Z is after 2026-02-23T05:33:13Z Feb 25 10:53:19 crc kubenswrapper[4725]: I0225 10:53:19.810957 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:19 crc kubenswrapper[4725]: I0225 10:53:19.812300 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:19 crc kubenswrapper[4725]: I0225 10:53:19.814066 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:19 crc kubenswrapper[4725]: I0225 10:53:19.814117 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:19 crc kubenswrapper[4725]: I0225 10:53:19.814136 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:19 crc kubenswrapper[4725]: 
I0225 10:53:19.815222 4725 scope.go:117] "RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:19 crc kubenswrapper[4725]: E0225 10:53:19.815594 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:20 crc kubenswrapper[4725]: I0225 10:53:20.146446 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:20Z is after 2026-02-23T05:33:13Z Feb 25 10:53:21 crc kubenswrapper[4725]: I0225 10:53:21.146424 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:21Z is after 2026-02-23T05:33:13Z Feb 25 10:53:21 crc kubenswrapper[4725]: W0225 10:53:21.784948 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:21Z is after 2026-02-23T05:33:13Z Feb 25 10:53:21 crc kubenswrapper[4725]: E0225 10:53:21.785092 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:22 crc kubenswrapper[4725]: I0225 10:53:22.145945 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:22Z is after 2026-02-23T05:33:13Z Feb 25 10:53:22 crc kubenswrapper[4725]: E0225 10:53:22.527095 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:22Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 25 10:53:22 crc kubenswrapper[4725]: I0225 10:53:22.546591 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:22 crc kubenswrapper[4725]: I0225 10:53:22.557972 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:22 crc kubenswrapper[4725]: I0225 10:53:22.558058 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:22 crc kubenswrapper[4725]: I0225 10:53:22.558092 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:22 crc kubenswrapper[4725]: I0225 10:53:22.558141 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:22 crc 
kubenswrapper[4725]: E0225 10:53:22.564086 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:22Z is after 2026-02-23T05:33:13Z" node="crc" Feb 25 10:53:23 crc kubenswrapper[4725]: I0225 10:53:23.146670 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:23Z is after 2026-02-23T05:33:13Z Feb 25 10:53:23 crc kubenswrapper[4725]: W0225 10:53:23.207814 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:23Z is after 2026-02-23T05:33:13Z Feb 25 10:53:23 crc kubenswrapper[4725]: E0225 10:53:23.207934 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:24 crc kubenswrapper[4725]: I0225 10:53:24.146057 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-25T10:53:24Z is after 2026-02-23T05:33:13Z Feb 25 10:53:25 crc kubenswrapper[4725]: I0225 10:53:25.144632 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:25Z is after 2026-02-23T05:33:13Z Feb 25 10:53:25 crc kubenswrapper[4725]: E0225 10:53:25.308729 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 10:53:25 crc kubenswrapper[4725]: W0225 10:53:25.857288 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:25Z is after 2026-02-23T05:33:13Z Feb 25 10:53:25 crc kubenswrapper[4725]: E0225 10:53:25.857368 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:53:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 10:53:26 crc kubenswrapper[4725]: I0225 10:53:26.148225 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.147299 4725 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.808065 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:52206->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.808920 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:52206->192.168.126.11:10357: read: connection reset by peer" Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.809048 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.809250 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.810621 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.810671 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.810690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.811869 4725 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 25 10:53:27 crc kubenswrapper[4725]: I0225 10:53:27.812098 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d" gracePeriod=30 Feb 25 10:53:28 crc kubenswrapper[4725]: I0225 10:53:28.147270 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:28 crc kubenswrapper[4725]: I0225 10:53:28.416689 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 10:53:28 crc kubenswrapper[4725]: I0225 10:53:28.417395 4725 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d" exitCode=255 Feb 25 10:53:28 crc kubenswrapper[4725]: I0225 10:53:28.417466 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d"} Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.133245 4725 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de5257b8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,LastTimestamp:2026-02-25 10:52:55.141505191 +0000 UTC m=+0.640087276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.136362 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.143991 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.144006 4725 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.147847 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e95f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC m=+0.707645951,LastTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC m=+0.707645951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.154693 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189777de5ba60fd3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.297634259 +0000 UTC m=+0.796216284,LastTimestamp:2026-02-25 10:52:55.297634259 +0000 UTC m=+0.796216284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.160755 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e42ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.324602271 +0000 UTC m=+0.823184326,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.166527 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e7328\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.324638842 +0000 UTC m=+0.823220907,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.172601 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e95f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e95f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC m=+0.707645951,LastTimestamp:2026-02-25 10:52:55.324656063 +0000 UTC m=+0.823238128,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.177709 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e42ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.325989675 +0000 UTC m=+0.824571700,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.183958 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e7328\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.326002196 +0000 UTC m=+0.824584221,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.190302 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e95f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e95f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC 
m=+0.707645951,LastTimestamp:2026-02-25 10:52:55.326011286 +0000 UTC m=+0.824593311,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.198250 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e42ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.326044917 +0000 UTC m=+0.824626942,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.205566 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e7328\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.326055857 +0000 UTC m=+0.824637882,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.210515 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e95f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e95f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC m=+0.707645951,LastTimestamp:2026-02-25 10:52:55.326064457 +0000 UTC m=+0.824646482,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.216273 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e42ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.326952329 +0000 UTC m=+0.825534344,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.220588 4725 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e7328\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.326960769 +0000 UTC m=+0.825542794,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.225304 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e95f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e95f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC m=+0.707645951,LastTimestamp:2026-02-25 10:52:55.326967469 +0000 UTC m=+0.825549494,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.230170 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e42ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.327462662 +0000 UTC m=+0.826044717,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.235175 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e7328\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.327487902 +0000 UTC m=+0.826069957,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.241047 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e95f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e95f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC m=+0.707645951,LastTimestamp:2026-02-25 10:52:55.327507343 +0000 UTC m=+0.826089398,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.244854 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e42ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.327754219 +0000 UTC m=+0.826336244,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.249761 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e7328\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.327767019 +0000 UTC m=+0.826349044,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.255269 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e95f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e95f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209063926 +0000 UTC m=+0.707645951,LastTimestamp:2026-02-25 10:52:55.327776649 +0000 UTC m=+0.826358674,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.260210 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e42ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e42ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209042605 +0000 UTC 
m=+0.707624630,LastTimestamp:2026-02-25 10:52:55.327994285 +0000 UTC m=+0.826576310,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.265587 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189777de565e7328\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189777de565e7328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.209055016 +0000 UTC m=+0.707637041,LastTimestamp:2026-02-25 10:52:55.328008585 +0000 UTC m=+0.826590610,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.273268 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777de7508e228 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.723541032 +0000 UTC m=+1.222123097,LastTimestamp:2026-02-25 10:52:55.723541032 +0000 UTC m=+1.222123097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.279458 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777de7516d971 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.724456305 +0000 UTC m=+1.223038370,LastTimestamp:2026-02-25 10:52:55.724456305 +0000 UTC m=+1.223038370,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.284034 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777de75ea6133 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.738319155 +0000 UTC m=+1.236901190,LastTimestamp:2026-02-25 10:52:55.738319155 +0000 UTC m=+1.236901190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.290059 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777de764b72e9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.744680681 +0000 UTC m=+1.243262746,LastTimestamp:2026-02-25 10:52:55.744680681 +0000 UTC m=+1.243262746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 
10:53:29.295968 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189777de76a48170 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:55.750517104 +0000 UTC m=+1.249099179,LastTimestamp:2026-02-25 10:52:55.750517104 +0000 UTC m=+1.249099179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.302481 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777de9a700047 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.351055943 +0000 UTC m=+1.849637988,LastTimestamp:2026-02-25 
10:52:56.351055943 +0000 UTC m=+1.849637988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.307087 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777de9a94a3c4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.353457092 +0000 UTC m=+1.852039127,LastTimestamp:2026-02-25 10:52:56.353457092 +0000 UTC m=+1.852039127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.312659 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777de9abad3b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.355959733 +0000 UTC m=+1.854541758,LastTimestamp:2026-02-25 10:52:56.355959733 +0000 UTC m=+1.854541758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.317925 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777de9ac6c12f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.356741423 +0000 UTC m=+1.855323438,LastTimestamp:2026-02-25 10:52:56.356741423 +0000 UTC m=+1.855323438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.322963 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189777de9ae911e4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.358990308 +0000 UTC m=+1.857572333,LastTimestamp:2026-02-25 10:52:56.358990308 +0000 UTC m=+1.857572333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.326614 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777de9b42a956 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.364861782 +0000 UTC m=+1.863443827,LastTimestamp:2026-02-25 10:52:56.364861782 +0000 UTC m=+1.863443827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.330451 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777de9b53872d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.365967149 +0000 UTC m=+1.864549174,LastTimestamp:2026-02-25 10:52:56.365967149 +0000 UTC m=+1.864549174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.334365 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777de9b6ae8df openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.367499487 +0000 UTC m=+1.866081512,LastTimestamp:2026-02-25 10:52:56.367499487 +0000 UTC m=+1.866081512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.338920 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777de9b7bcb8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.368606094 +0000 UTC m=+1.867188119,LastTimestamp:2026-02-25 10:52:56.368606094 +0000 UTC m=+1.867188119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.342526 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777de9bb87d1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.372583711 +0000 UTC m=+1.871165736,LastTimestamp:2026-02-25 10:52:56.372583711 +0000 UTC m=+1.871165736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.343449 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189777de9c025088 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.37742196 +0000 UTC m=+1.876003995,LastTimestamp:2026-02-25 10:52:56.37742196 +0000 UTC m=+1.876003995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.346997 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777dead0c0082 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.663269506 +0000 UTC m=+2.161851581,LastTimestamp:2026-02-25 10:52:56.663269506 +0000 UTC m=+2.161851581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.352250 4725 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777deae0d98a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.6801512 +0000 UTC m=+2.178733245,LastTimestamp:2026-02-25 10:52:56.6801512 +0000 UTC m=+2.178733245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.355791 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777deae2c10c7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 
10:52:56.682148039 +0000 UTC m=+2.180730064,LastTimestamp:2026-02-25 10:52:56.682148039 +0000 UTC m=+2.180730064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.359441 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777debb190931 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.899004721 +0000 UTC m=+2.397586736,LastTimestamp:2026-02-25 10:52:56.899004721 +0000 UTC m=+2.397586736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.362924 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777debbe42668 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.912316008 +0000 UTC m=+2.410898063,LastTimestamp:2026-02-25 10:52:56.912316008 +0000 UTC m=+2.410898063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.366455 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777debc02074e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.914274126 +0000 UTC m=+2.412856181,LastTimestamp:2026-02-25 10:52:56.914274126 +0000 UTC m=+2.412856181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.370055 4725 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777decb1bde4c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.167625804 +0000 UTC m=+2.666207829,LastTimestamp:2026-02-25 10:52:57.167625804 +0000 UTC m=+2.666207829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.373591 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777decbfc7f11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.182347025 +0000 UTC m=+2.680929050,LastTimestamp:2026-02-25 10:52:57.182347025 +0000 UTC 
m=+2.680929050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.379803 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189777decffb0f4f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.249361743 +0000 UTC m=+2.747943778,LastTimestamp:2026-02-25 10:52:57.249361743 +0000 UTC m=+2.747943778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.386604 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777ded0a2e8d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.260361943 +0000 UTC m=+2.758943968,LastTimestamp:2026-02-25 10:52:57.260361943 +0000 UTC m=+2.758943968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.390664 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777ded0a527f8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.260509176 +0000 UTC m=+2.759091221,LastTimestamp:2026-02-25 10:52:57.260509176 +0000 UTC m=+2.759091221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.397629 4725 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777ded0c3ec2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.262525483 +0000 UTC m=+2.761107508,LastTimestamp:2026-02-25 10:52:57.262525483 +0000 UTC m=+2.761107508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.403693 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189777dedbfc2f31 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.450762033 +0000 UTC m=+2.949344058,LastTimestamp:2026-02-25 10:52:57.450762033 +0000 UTC m=+2.949344058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.409479 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777dedc50dd66 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.456311654 +0000 UTC m=+2.954893679,LastTimestamp:2026-02-25 10:52:57.456311654 +0000 UTC m=+2.954893679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.413685 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777dedc530729 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.456453417 +0000 UTC m=+2.955035442,LastTimestamp:2026-02-25 10:52:57.456453417 +0000 UTC 
m=+2.955035442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.418288 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189777dedce487ce openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.46598907 +0000 UTC m=+2.964571095,LastTimestamp:2026-02-25 10:52:57.46598907 +0000 UTC m=+2.964571095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.422695 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.423390 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2"} Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.423553 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:29 crc 
kubenswrapper[4725]: E0225 10:53:29.424444 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777dedd9264bd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.477383357 +0000 UTC m=+2.975965382,LastTimestamp:2026-02-25 10:52:57.477383357 +0000 UTC m=+2.975965382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.424661 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.424721 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.424745 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.431240 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777deddaa6bde openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.478958046 +0000 UTC m=+2.977540071,LastTimestamp:2026-02-25 10:52:57.478958046 +0000 UTC m=+2.977540071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.437209 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777deddb4ec28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.479646248 +0000 UTC m=+2.978228273,LastTimestamp:2026-02-25 10:52:57.479646248 +0000 UTC m=+2.978228273,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.442676 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777deddc91c38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.480969272 +0000 UTC m=+2.979551297,LastTimestamp:2026-02-25 10:52:57.480969272 +0000 UTC m=+2.979551297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.444940 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777dede6c87bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.491679167 +0000 UTC m=+2.990261192,LastTimestamp:2026-02-25 10:52:57.491679167 +0000 UTC m=+2.990261192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.448823 4725 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777dee050fbc6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.523428294 +0000 UTC m=+3.022010329,LastTimestamp:2026-02-25 10:52:57.523428294 +0000 UTC m=+3.022010329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.454410 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777dee92286c8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.671378632 +0000 UTC m=+3.169960657,LastTimestamp:2026-02-25 10:52:57.671378632 +0000 UTC m=+3.169960657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc 
kubenswrapper[4725]: E0225 10:53:29.459887 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777dee96d6223 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.676284451 +0000 UTC m=+3.174866476,LastTimestamp:2026-02-25 10:52:57.676284451 +0000 UTC m=+3.174866476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.465712 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777deea661ee1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.692585697 +0000 UTC m=+3.191167742,LastTimestamp:2026-02-25 10:52:57.692585697 +0000 UTC m=+3.191167742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.474013 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777deea7d77b5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.694115765 +0000 UTC m=+3.192697790,LastTimestamp:2026-02-25 10:52:57.694115765 +0000 UTC m=+3.192697790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.479915 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777deeaf29b0c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.701792524 +0000 UTC m=+3.200374549,LastTimestamp:2026-02-25 10:52:57.701792524 +0000 UTC m=+3.200374549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.483501 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777deeb744eb3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.710292659 +0000 UTC m=+3.208874684,LastTimestamp:2026-02-25 10:52:57.710292659 +0000 UTC m=+3.208874684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.486773 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777def5a3dade openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.881180894 +0000 UTC m=+3.379762919,LastTimestamp:2026-02-25 10:52:57.881180894 +0000 UTC m=+3.379762919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.492258 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777def5dd1bbc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.884933052 +0000 UTC m=+3.383515067,LastTimestamp:2026-02-25 10:52:57.884933052 +0000 UTC m=+3.383515067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.497748 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189777def674949d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.894859933 +0000 UTC m=+3.393441958,LastTimestamp:2026-02-25 10:52:57.894859933 +0000 UTC m=+3.393441958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.503467 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777def6b7d483 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.899267203 +0000 UTC m=+3.397849228,LastTimestamp:2026-02-25 10:52:57.899267203 +0000 UTC m=+3.397849228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.510141 4725 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777def6cc9ed6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:57.900629718 +0000 UTC m=+3.399211743,LastTimestamp:2026-02-25 10:52:57.900629718 +0000 UTC m=+3.399211743,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.516548 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777deffbfb8be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.050779326 +0000 UTC m=+3.549361351,LastTimestamp:2026-02-25 10:52:58.050779326 +0000 UTC m=+3.549361351,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.523322 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777df0079abe0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.062965728 +0000 UTC m=+3.561547753,LastTimestamp:2026-02-25 10:52:58.062965728 +0000 UTC m=+3.561547753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.530329 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.530385 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777df008ecf7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.064351103 +0000 UTC m=+3.562933138,LastTimestamp:2026-02-25 10:52:58.064351103 +0000 UTC m=+3.562933138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.537011 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777df0be6eb0e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.254674702 +0000 UTC m=+3.753256717,LastTimestamp:2026-02-25 10:52:58.254674702 +0000 UTC m=+3.753256717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.544043 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777df0cf5b77a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.272421754 +0000 UTC m=+3.771003789,LastTimestamp:2026-02-25 10:52:58.272421754 +0000 UTC m=+3.771003789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.548992 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df0dc743c6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.286154694 +0000 UTC m=+3.784736719,LastTimestamp:2026-02-25 10:52:58.286154694 +0000 UTC m=+3.784736719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc 
kubenswrapper[4725]: E0225 10:53:29.557058 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df1a2643da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.493707226 +0000 UTC m=+3.992289251,LastTimestamp:2026-02-25 10:52:58.493707226 +0000 UTC m=+3.992289251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.563710 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df1af68926 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.507356454 +0000 UTC m=+4.005938479,LastTimestamp:2026-02-25 10:52:58.507356454 +0000 UTC m=+4.005938479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.564739 
4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.565937 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.565975 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.565989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:29 crc kubenswrapper[4725]: I0225 10:53:29.566016 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.569258 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.569498 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df4a1878c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:59.298109634 +0000 UTC m=+4.796691699,LastTimestamp:2026-02-25 10:52:59.298109634 +0000 UTC 
m=+4.796691699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.571099 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df580b1d92 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:59.532115346 +0000 UTC m=+5.030697411,LastTimestamp:2026-02-25 10:52:59.532115346 +0000 UTC m=+5.030697411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.576422 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df58c85214 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:59.544515092 +0000 UTC m=+5.043097147,LastTimestamp:2026-02-25 10:52:59.544515092 +0000 UTC m=+5.043097147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.580324 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df58df2789 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:59.546011529 +0000 UTC m=+5.044593594,LastTimestamp:2026-02-25 10:52:59.546011529 +0000 UTC m=+5.044593594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.586778 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df67c41f94 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:59.79589826 +0000 UTC m=+5.294480295,LastTimestamp:2026-02-25 10:52:59.79589826 +0000 UTC 
m=+5.294480295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.593241 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df689e1ece openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:59.81018491 +0000 UTC m=+5.308766965,LastTimestamp:2026-02-25 10:52:59.81018491 +0000 UTC m=+5.308766965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.599679 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df68bbfd09 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:59.812142345 +0000 UTC 
m=+5.310724380,LastTimestamp:2026-02-25 10:52:59.812142345 +0000 UTC m=+5.310724380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.605911 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df772d83cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:00.054463439 +0000 UTC m=+5.553045474,LastTimestamp:2026-02-25 10:53:00.054463439 +0000 UTC m=+5.553045474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.612067 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df785d2800 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:00.07436288 +0000 UTC m=+5.572944915,LastTimestamp:2026-02-25 10:53:00.07436288 +0000 UTC 
m=+5.572944915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.619049 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df7876f1a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:00.076052901 +0000 UTC m=+5.574634936,LastTimestamp:2026-02-25 10:53:00.076052901 +0000 UTC m=+5.574634936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.626573 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df85f197a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:00.302194601 +0000 UTC 
m=+5.800776636,LastTimestamp:2026-02-25 10:53:00.302194601 +0000 UTC m=+5.800776636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.634580 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df86b2cb81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:00.314856321 +0000 UTC m=+5.813438356,LastTimestamp:2026-02-25 10:53:00.314856321 +0000 UTC m=+5.813438356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.641077 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df86c4d818 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:00.316039192 +0000 UTC m=+5.814621227,LastTimestamp:2026-02-25 10:53:00.316039192 +0000 UTC m=+5.814621227,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.647523 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df9282c6f6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:00.513036022 +0000 UTC m=+6.011618077,LastTimestamp:2026-02-25 10:53:00.513036022 +0000 UTC m=+6.011618077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.654018 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189777df9384e12c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 
10:53:00.52995102 +0000 UTC m=+6.028533055,LastTimestamp:2026-02-25 10:53:00.52995102 +0000 UTC m=+6.028533055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.659760 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 10:53:29 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189777e1487e3bea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 25 10:53:29 crc kubenswrapper[4725]: body: Feb 25 10:53:29 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:07.86115889 +0000 UTC m=+13.359740945,LastTimestamp:2026-02-25 10:53:07.86115889 +0000 UTC m=+13.359740945,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 10:53:29 crc kubenswrapper[4725]: > Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.666151 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777e1487fa753 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:07.861251923 +0000 UTC m=+13.359833988,LastTimestamp:2026-02-25 10:53:07.861251923 +0000 UTC m=+13.359833988,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.672889 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 25 10:53:29 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-apiserver-crc.189777e193e84aa6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 25 10:53:29 crc kubenswrapper[4725]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 10:53:29 crc kubenswrapper[4725]: Feb 25 10:53:29 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:09.126400678 +0000 UTC 
m=+14.624982713,LastTimestamp:2026-02-25 10:53:09.126400678 +0000 UTC m=+14.624982713,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 10:53:29 crc kubenswrapper[4725]: > Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.679736 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777e193e8ecd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:09.126442199 +0000 UTC m=+14.625024224,LastTimestamp:2026-02-25 10:53:09.126442199 +0000 UTC m=+14.625024224,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.686893 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189777e193e84aa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 25 10:53:29 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-apiserver-crc.189777e193e84aa6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 25 10:53:29 crc kubenswrapper[4725]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 10:53:29 crc kubenswrapper[4725]: Feb 25 10:53:29 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:09.126400678 +0000 UTC m=+14.624982713,LastTimestamp:2026-02-25 10:53:09.130850154 +0000 UTC m=+14.629432179,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 10:53:29 crc kubenswrapper[4725]: > Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.693939 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189777e193e8ecd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777e193e8ecd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:09.126442199 +0000 UTC m=+14.625024224,LastTimestamp:2026-02-25 10:53:09.130920686 +0000 UTC m=+14.629502711,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.699408 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189777df008ecf7f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777df008ecf7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.064351103 +0000 UTC m=+3.562933138,LastTimestamp:2026-02-25 10:53:09.341190302 +0000 UTC m=+14.839772327,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.706085 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189777df0be6eb0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777df0be6eb0e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.254674702 +0000 UTC m=+3.753256717,LastTimestamp:2026-02-25 10:53:09.519908293 +0000 UTC m=+15.018490318,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.713782 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189777df0cf5b77a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189777df0cf5b77a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:58.272421754 +0000 UTC m=+3.771003789,LastTimestamp:2026-02-25 10:53:09.529853803 +0000 UTC m=+15.028435818,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.722813 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Feb 25 10:53:29 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189777e39c8ceea6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 25 10:53:29 crc kubenswrapper[4725]: body: Feb 25 10:53:29 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:17.861342886 +0000 UTC m=+23.359924941,LastTimestamp:2026-02-25 10:53:17.861342886 +0000 UTC m=+23.359924941,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 10:53:29 crc kubenswrapper[4725]: > Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.728675 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777e39c8e0d2e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:17.861416238 +0000 UTC m=+23.359998303,LastTimestamp:2026-02-25 10:53:17.861416238 +0000 UTC m=+23.359998303,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.734716 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 10:53:29 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189777e5ed786828 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:52206->192.168.126.11:10357: read: connection reset by peer Feb 25 10:53:29 crc kubenswrapper[4725]: body: Feb 25 10:53:29 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:27.808886824 +0000 UTC m=+33.307468919,LastTimestamp:2026-02-25 10:53:27.808886824 +0000 UTC m=+33.307468919,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 10:53:29 crc kubenswrapper[4725]: > Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.739554 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777e5ed7a3a9d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:52206->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:27.809006237 +0000 UTC m=+33.307588302,LastTimestamp:2026-02-25 10:53:27.809006237 +0000 UTC m=+33.307588302,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.744123 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777e5eda8f667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:27.812068967 +0000 UTC m=+33.310651042,LastTimestamp:2026-02-25 10:53:27.812068967 +0000 UTC m=+33.310651042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.750035 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189777de9b6ae8df\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777de9b6ae8df openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.367499487 +0000 UTC m=+1.866081512,LastTimestamp:2026-02-25 10:53:28.337239113 +0000 UTC m=+33.835821138,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.755558 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189777dead0c0082\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777dead0c0082 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.663269506 +0000 UTC m=+2.161851581,LastTimestamp:2026-02-25 10:53:28.57702525 +0000 UTC m=+34.075607275,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:29 crc kubenswrapper[4725]: E0225 10:53:29.760281 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189777deae0d98a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777deae0d98a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:52:56.6801512 +0000 UTC m=+2.178733245,LastTimestamp:2026-02-25 10:53:28.587699549 +0000 UTC m=+34.086281574,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:30 crc kubenswrapper[4725]: I0225 10:53:30.146190 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:30 crc kubenswrapper[4725]: I0225 10:53:30.426277 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:30 crc kubenswrapper[4725]: I0225 10:53:30.428276 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:30 crc kubenswrapper[4725]: I0225 10:53:30.428330 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:30 crc kubenswrapper[4725]: I0225 10:53:30.428341 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:31 crc kubenswrapper[4725]: I0225 10:53:31.145449 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:32 crc kubenswrapper[4725]: I0225 10:53:32.148337 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:33 crc kubenswrapper[4725]: I0225 10:53:33.148241 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:33 crc kubenswrapper[4725]: I0225 10:53:33.223472 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:33 crc kubenswrapper[4725]: I0225 10:53:33.224565 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 10:53:33 crc kubenswrapper[4725]: I0225 10:53:33.224628 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:33 crc kubenswrapper[4725]: I0225 10:53:33.224653 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:33 crc kubenswrapper[4725]: I0225 10:53:33.225552 4725 scope.go:117] "RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.148487 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.439426 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.440022 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.441724 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d137a499eb46cf9e8ff5adff6cd3051c4194610ad46e75028fc84c47094cfc15" exitCode=255 Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.441761 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d137a499eb46cf9e8ff5adff6cd3051c4194610ad46e75028fc84c47094cfc15"} Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.441794 4725 scope.go:117] 
"RemoveContainer" containerID="a3a4fa742170ce2b707365f9777a94590db5b99dcba6353511ffc3202998c6db" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.441978 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.443097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.443156 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.443180 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.444096 4725 scope.go:117] "RemoveContainer" containerID="d137a499eb46cf9e8ff5adff6cd3051c4194610ad46e75028fc84c47094cfc15" Feb 25 10:53:34 crc kubenswrapper[4725]: E0225 10:53:34.444400 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.860552 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.860694 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.861947 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:34 crc 
kubenswrapper[4725]: I0225 10:53:34.861977 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:34 crc kubenswrapper[4725]: I0225 10:53:34.861986 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.035214 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.051524 4725 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.147461 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:35 crc kubenswrapper[4725]: E0225 10:53:35.309073 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.446602 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.517574 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.517896 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.519362 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:35 crc kubenswrapper[4725]: 
I0225 10:53:35.519427 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:35 crc kubenswrapper[4725]: I0225 10:53:35.519452 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:36 crc kubenswrapper[4725]: I0225 10:53:36.145620 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:36 crc kubenswrapper[4725]: E0225 10:53:36.536011 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 10:53:36 crc kubenswrapper[4725]: I0225 10:53:36.569638 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:36 crc kubenswrapper[4725]: I0225 10:53:36.571348 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:36 crc kubenswrapper[4725]: I0225 10:53:36.571394 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:36 crc kubenswrapper[4725]: I0225 10:53:36.571406 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:36 crc kubenswrapper[4725]: I0225 10:53:36.571434 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:36 crc kubenswrapper[4725]: E0225 10:53:36.578423 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the 
cluster scope" node="crc" Feb 25 10:53:37 crc kubenswrapper[4725]: I0225 10:53:37.148718 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:37 crc kubenswrapper[4725]: W0225 10:53:37.352292 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:37 crc kubenswrapper[4725]: E0225 10:53:37.352373 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 25 10:53:37 crc kubenswrapper[4725]: I0225 10:53:37.861631 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 10:53:37 crc kubenswrapper[4725]: I0225 10:53:37.861729 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 10:53:37 crc kubenswrapper[4725]: E0225 10:53:37.868144 4725 event.go:359] "Server rejected 
event (will not retry!)" err="events \"kube-controller-manager-crc.189777e39c8ceea6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 10:53:37 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189777e39c8ceea6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 25 10:53:37 crc kubenswrapper[4725]: body: Feb 25 10:53:37 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:17.861342886 +0000 UTC m=+23.359924941,LastTimestamp:2026-02-25 10:53:37.861702691 +0000 UTC m=+43.360284756,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 10:53:37 crc kubenswrapper[4725]: > Feb 25 10:53:37 crc kubenswrapper[4725]: E0225 10:53:37.874591 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189777e39c8e0d2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189777e39c8e0d2e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:53:17.861416238 +0000 UTC m=+23.359998303,LastTimestamp:2026-02-25 10:53:37.861764883 +0000 UTC m=+43.360346948,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:53:38 crc kubenswrapper[4725]: I0225 10:53:38.149049 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:38 crc kubenswrapper[4725]: W0225 10:53:38.584590 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 25 10:53:38 crc kubenswrapper[4725]: E0225 10:53:38.584676 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 25 10:53:39 crc kubenswrapper[4725]: I0225 10:53:39.147573 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Feb 25 10:53:39 crc kubenswrapper[4725]: I0225 10:53:39.810435 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:39 crc kubenswrapper[4725]: I0225 10:53:39.810622 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:39 crc kubenswrapper[4725]: I0225 10:53:39.812103 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:39 crc kubenswrapper[4725]: I0225 10:53:39.812193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:39 crc kubenswrapper[4725]: I0225 10:53:39.812224 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:39 crc kubenswrapper[4725]: I0225 10:53:39.813010 4725 scope.go:117] "RemoveContainer" containerID="d137a499eb46cf9e8ff5adff6cd3051c4194610ad46e75028fc84c47094cfc15" Feb 25 10:53:39 crc kubenswrapper[4725]: E0225 10:53:39.813273 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:40 crc kubenswrapper[4725]: I0225 10:53:40.145200 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:41 crc kubenswrapper[4725]: I0225 10:53:41.147243 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:41 crc kubenswrapper[4725]: W0225 10:53:41.716308 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 25 10:53:41 crc kubenswrapper[4725]: E0225 10:53:41.716399 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 25 10:53:42 crc kubenswrapper[4725]: I0225 10:53:42.148899 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:42 crc kubenswrapper[4725]: I0225 10:53:42.842964 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:53:42 crc kubenswrapper[4725]: I0225 10:53:42.843193 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:42 crc kubenswrapper[4725]: I0225 10:53:42.844962 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:42 crc kubenswrapper[4725]: I0225 10:53:42.845024 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:42 crc kubenswrapper[4725]: I0225 10:53:42.845034 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:42 crc kubenswrapper[4725]: I0225 
10:53:42.845622 4725 scope.go:117] "RemoveContainer" containerID="d137a499eb46cf9e8ff5adff6cd3051c4194610ad46e75028fc84c47094cfc15" Feb 25 10:53:42 crc kubenswrapper[4725]: E0225 10:53:42.845802 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:43 crc kubenswrapper[4725]: I0225 10:53:43.147299 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:43 crc kubenswrapper[4725]: E0225 10:53:43.541583 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 10:53:43 crc kubenswrapper[4725]: I0225 10:53:43.578669 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:43 crc kubenswrapper[4725]: I0225 10:53:43.580595 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:43 crc kubenswrapper[4725]: I0225 10:53:43.580651 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:43 crc kubenswrapper[4725]: I0225 10:53:43.580668 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:43 crc kubenswrapper[4725]: I0225 10:53:43.580707 4725 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:43 crc kubenswrapper[4725]: E0225 10:53:43.583795 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 10:53:44 crc kubenswrapper[4725]: I0225 10:53:44.145537 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:45 crc kubenswrapper[4725]: I0225 10:53:45.145580 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:45 crc kubenswrapper[4725]: E0225 10:53:45.309316 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 10:53:45 crc kubenswrapper[4725]: I0225 10:53:45.972072 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:45 crc kubenswrapper[4725]: I0225 10:53:45.972328 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:45 crc kubenswrapper[4725]: I0225 10:53:45.974010 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:45 crc kubenswrapper[4725]: I0225 10:53:45.974063 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:45 crc kubenswrapper[4725]: I0225 10:53:45.974086 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 25 10:53:45 crc kubenswrapper[4725]: I0225 10:53:45.977193 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 10:53:46 crc kubenswrapper[4725]: I0225 10:53:46.148040 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:46 crc kubenswrapper[4725]: I0225 10:53:46.479002 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:46 crc kubenswrapper[4725]: I0225 10:53:46.480124 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:46 crc kubenswrapper[4725]: I0225 10:53:46.480173 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:46 crc kubenswrapper[4725]: I0225 10:53:46.480184 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:47 crc kubenswrapper[4725]: I0225 10:53:47.147512 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:48 crc kubenswrapper[4725]: I0225 10:53:48.047498 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 10:53:48 crc kubenswrapper[4725]: I0225 10:53:48.047660 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:48 crc kubenswrapper[4725]: I0225 10:53:48.048700 4725 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:48 crc kubenswrapper[4725]: I0225 10:53:48.048730 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:48 crc kubenswrapper[4725]: I0225 10:53:48.048740 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:48 crc kubenswrapper[4725]: I0225 10:53:48.149520 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:49 crc kubenswrapper[4725]: I0225 10:53:49.145753 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:50 crc kubenswrapper[4725]: I0225 10:53:50.147357 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:50 crc kubenswrapper[4725]: E0225 10:53:50.547584 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 10:53:50 crc kubenswrapper[4725]: I0225 10:53:50.584382 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:50 crc kubenswrapper[4725]: I0225 10:53:50.586635 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:50 
crc kubenswrapper[4725]: I0225 10:53:50.586682 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:50 crc kubenswrapper[4725]: I0225 10:53:50.586694 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:50 crc kubenswrapper[4725]: I0225 10:53:50.586724 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:50 crc kubenswrapper[4725]: E0225 10:53:50.594632 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 10:53:50 crc kubenswrapper[4725]: W0225 10:53:50.848181 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 25 10:53:50 crc kubenswrapper[4725]: E0225 10:53:50.848303 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 25 10:53:51 crc kubenswrapper[4725]: I0225 10:53:51.145919 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:52 crc kubenswrapper[4725]: I0225 10:53:52.145324 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:53 crc kubenswrapper[4725]: I0225 10:53:53.147452 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:54 crc kubenswrapper[4725]: I0225 10:53:54.145804 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:55 crc kubenswrapper[4725]: I0225 10:53:55.147469 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:55 crc kubenswrapper[4725]: E0225 10:53:55.309897 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 10:53:56 crc kubenswrapper[4725]: I0225 10:53:56.144540 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:57 crc kubenswrapper[4725]: I0225 10:53:57.148484 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:57 crc kubenswrapper[4725]: E0225 10:53:57.554565 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get 
resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 10:53:57 crc kubenswrapper[4725]: I0225 10:53:57.594963 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:57 crc kubenswrapper[4725]: I0225 10:53:57.596440 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:57 crc kubenswrapper[4725]: I0225 10:53:57.596477 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:57 crc kubenswrapper[4725]: I0225 10:53:57.596487 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:57 crc kubenswrapper[4725]: I0225 10:53:57.596515 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:53:57 crc kubenswrapper[4725]: E0225 10:53:57.602281 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.146151 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.223660 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.225188 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.225268 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.225294 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.226333 4725 scope.go:117] "RemoveContainer" containerID="d137a499eb46cf9e8ff5adff6cd3051c4194610ad46e75028fc84c47094cfc15" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.513950 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.515870 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2"} Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.516048 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.517204 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.517241 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:58 crc kubenswrapper[4725]: I0225 10:53:58.517252 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.146347 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.520219 4725 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.521594 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.523796 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" exitCode=255 Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.523862 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2"} Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.523929 4725 scope.go:117] "RemoveContainer" containerID="d137a499eb46cf9e8ff5adff6cd3051c4194610ad46e75028fc84c47094cfc15" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.524067 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.524964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.525002 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.525016 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.525686 4725 scope.go:117] "RemoveContainer" 
containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:53:59 crc kubenswrapper[4725]: E0225 10:53:59.525897 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:53:59 crc kubenswrapper[4725]: I0225 10:53:59.810711 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.150104 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.163531 4725 csr.go:261] certificate signing request csr-ggnz8 is approved, waiting to be issued Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.172379 4725 csr.go:257] certificate signing request csr-ggnz8 is issued Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.183310 4725 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.223754 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.225493 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.225528 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:00 crc 
kubenswrapper[4725]: I0225 10:54:00.225539 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.526988 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.529471 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.530272 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.530386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.530457 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.531107 4725 scope.go:117] "RemoveContainer" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:54:00 crc kubenswrapper[4725]: E0225 10:54:00.531355 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:54:00 crc kubenswrapper[4725]: I0225 10:54:00.970077 4725 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 25 10:54:01 crc kubenswrapper[4725]: I0225 10:54:01.174347 4725 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-21 07:42:27.960018849 +0000 UTC Feb 25 10:54:01 crc kubenswrapper[4725]: I0225 10:54:01.174418 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6452h48m26.785605529s for next certificate rotation Feb 25 10:54:02 crc kubenswrapper[4725]: I0225 10:54:02.843330 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:54:02 crc kubenswrapper[4725]: I0225 10:54:02.843579 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:54:02 crc kubenswrapper[4725]: I0225 10:54:02.844999 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:02 crc kubenswrapper[4725]: I0225 10:54:02.845075 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:02 crc kubenswrapper[4725]: I0225 10:54:02.845095 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:02 crc kubenswrapper[4725]: I0225 10:54:02.846211 4725 scope.go:117] "RemoveContainer" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:54:02 crc kubenswrapper[4725]: E0225 10:54:02.846530 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.184353 4725 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.602920 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.604432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.604483 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.604502 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.604634 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.612624 4725 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.612740 4725 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.612761 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.616884 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.616954 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.616968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.616989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.617002 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:04Z","lastTransitionTime":"2026-02-25T10:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.633566 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.637870 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.637919 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.637938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.637966 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.637983 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:04Z","lastTransitionTime":"2026-02-25T10:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.650033 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.662692 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.662731 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.662742 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.662763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.662775 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:04Z","lastTransitionTime":"2026-02-25T10:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.675160 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.679503 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.679542 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.679552 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.679569 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:04 crc kubenswrapper[4725]: I0225 10:54:04.679578 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:04Z","lastTransitionTime":"2026-02-25T10:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.690163 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.690319 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.690357 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.790535 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.891200 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:04 crc kubenswrapper[4725]: E0225 10:54:04.992393 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.093205 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.194041 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.295616 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.310910 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.396082 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: 
E0225 10:54:05.496702 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.597715 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.698649 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.799763 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:05 crc kubenswrapper[4725]: E0225 10:54:05.900579 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.000841 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.101352 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.201694 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.302724 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.403075 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.504049 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.604219 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.705176 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.806157 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:06 crc kubenswrapper[4725]: E0225 10:54:06.906984 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.007270 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.108070 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.208755 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.309427 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.409974 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.510316 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.610803 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.711414 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.812525 4725 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Feb 25 10:54:07 crc kubenswrapper[4725]: E0225 10:54:07.913423 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.014286 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.115005 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.216081 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.316571 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.417048 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.517581 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.617726 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.718907 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.819196 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:08 crc kubenswrapper[4725]: E0225 10:54:08.920312 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.021274 4725 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.121462 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.221591 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.322193 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.423416 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.524061 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.624508 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.725550 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.826632 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:09 crc kubenswrapper[4725]: E0225 10:54:09.927630 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:10 crc kubenswrapper[4725]: E0225 10:54:10.027903 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:10 crc kubenswrapper[4725]: E0225 10:54:10.128969 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 10:54:10 crc 
kubenswrapper[4725]: I0225 10:54:10.171947 4725 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.232698 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.232734 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.232742 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.232757 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.232767 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.337440 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.337803 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.337875 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.337907 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.337927 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.441467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.441552 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.441569 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.441592 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.441610 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.545619 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.545684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.545699 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.545724 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.545741 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.648986 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.649027 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.649038 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.649055 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.649067 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.751729 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.751766 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.751777 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.751792 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.751804 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.853890 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.853968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.854005 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.854037 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.854059 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.957569 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.957702 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.957731 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.957804 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:10 crc kubenswrapper[4725]: I0225 10:54:10.957873 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:10Z","lastTransitionTime":"2026-02-25T10:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.061355 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.061409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.061426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.061447 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.061458 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.164531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.164607 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.164630 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.164656 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.164674 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.168770 4725 apiserver.go:52] "Watching apiserver" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.176277 4725 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.176988 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9989l","openshift-multus/network-metrics-daemon-7k279","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-6klc9","openshift-image-registry/node-ca-8zw9d","openshift-multus/multus-additional-cni-plugins-9mhzp","openshift-multus/multus-d6b9f","openshift-network-diagnostics/network-check-target-xd92c","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj","openshift-machine-config-operator/machine-config-daemon-256sf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.177469 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.177479 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.177745 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.177888 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.177954 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.177972 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.178178 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.178305 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.178367 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.178445 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.178556 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.178619 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9989l" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.178742 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.179079 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.181222 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.181295 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.181366 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.181776 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.182466 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.182780 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.184454 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.185162 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.187114 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.187443 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.187680 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.187773 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.187891 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 
10:54:11.188075 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.188257 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.188392 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.188581 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.188755 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.188981 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.189246 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.189261 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.189404 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.189890 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.189898 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.190381 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.190532 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.190715 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.190884 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.190999 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.191948 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.192066 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.193116 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.194110 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.196902 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.197284 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.198648 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.198931 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.198954 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.198932 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.198931 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.199053 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.211310 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.224322 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.233613 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.244894 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.255138 4725 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.257027 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.257335 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.257501 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.257626 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.257735 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.257869 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258024 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258534 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258664 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258764 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258990 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259162 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259270 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259383 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259507 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259616 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258352 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.260069 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.260107 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259983 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258411 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.258932 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259131 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259176 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.260358 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259203 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259239 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259920 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.260479 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.259961 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.261578 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.261865 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.262032 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.262653 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263318 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263347 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263411 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263440 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263466 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263493 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263523 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263549 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263475 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263575 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263779 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263803 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263839 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 10:54:11 crc 
kubenswrapper[4725]: I0225 10:54:11.263861 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263887 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263954 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.263988 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264012 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264024 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264038 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264064 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264071 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264084 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264104 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264123 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264146 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264164 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264185 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264191 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264206 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264231 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264252 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264271 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264290 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264310 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264330 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264349 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264369 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 
10:54:11.264393 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264394 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264415 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264436 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264461 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264485 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264507 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264527 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264547 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264564 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264586 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 
10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264606 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264624 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264644 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264663 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264683 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264704 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264728 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264749 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264768 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264785 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264803 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264821 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264857 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264878 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264897 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264913 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264932 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264951 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264970 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264989 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265006 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265025 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 10:54:11 crc 
kubenswrapper[4725]: I0225 10:54:11.265131 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265152 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265169 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265213 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265232 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265251 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265269 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265288 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265308 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265324 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265344 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 10:54:11 
crc kubenswrapper[4725]: I0225 10:54:11.265369 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265388 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265404 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265419 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265439 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265457 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265474 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265489 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265505 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265521 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265539 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265559 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265578 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265595 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265623 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265642 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265659 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265675 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265694 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265711 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265865 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265886 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265904 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265922 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265941 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265958 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265978 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265995 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266014 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266031 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266047 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266065 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266083 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266102 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266141 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266162 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266180 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266207 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266227 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266244 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266260 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266278 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266295 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266311 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266331 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266376 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266394 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266411 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266427 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266445 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 10:54:11 crc kubenswrapper[4725]: 
I0225 10:54:11.266461 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266480 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266504 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266525 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266544 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266568 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266587 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266604 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266621 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266637 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266653 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266671 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266691 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266708 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266726 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266746 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266763 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266783 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266807 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266841 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266878 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266903 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266923 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266941 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266959 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266979 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266997 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267013 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267030 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267049 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267066 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267084 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267101 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267120 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267138 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267157 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267177 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267194 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267214 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267233 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267254 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267274 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267291 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267309 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267326 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267342 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267363 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267396 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267416 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267434 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267453 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267471 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267554 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r45dq\" (UniqueName: \"kubernetes.io/projected/0c8d8877-1961-407f-b4a7-66e55321a6eb-kube-api-access-r45dq\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267579 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-os-release\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267599 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-kubelet\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267618 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovn-node-metrics-cert\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267636 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-os-release\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267688 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267710 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267731 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267751 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f769618-965f-430a-8f67-e1ef4d94a063-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267770 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-k8s-cni-cncf-io\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267786 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-netns\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267805 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-conf-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267824 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-netd\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267857 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-log-socket\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267878 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267900 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267921 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4742f60-e555-4f96-be12-b9e46a857bd4-mcd-auth-proxy-config\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267936 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c8d8877-1961-407f-b4a7-66e55321a6eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267956 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f769618-965f-430a-8f67-e1ef4d94a063-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267975 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-system-cni-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267993 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268015 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268034 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-kubelet\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268051 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-systemd\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268070 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hct4s\" (UniqueName: \"kubernetes.io/projected/07a39624-e0d8-44dc-9596-cd7224f58d5d-kube-api-access-hct4s\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268093 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268110 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4742f60-e555-4f96-be12-b9e46a857bd4-proxy-tls\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268126 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-system-cni-dir\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268146 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-cni-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268166 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-multus-certs\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268187 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268205 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-cnibin\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268224 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268241 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f769618-965f-430a-8f67-e1ef4d94a063-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268260 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-etc-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268277 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-bin\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268293 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-script-lib\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268309 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c4742f60-e555-4f96-be12-b9e46a857bd4-rootfs\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268343 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-cni-multus\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268557 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-etc-kubernetes\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268584 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4a262bc-bc77-471f-91d7-58fb221fa404-serviceca\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268607 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-slash\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268632 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-netns\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268656 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-config\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268682 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-socket-dir-parent\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268711 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwml6\" (UniqueName: \"kubernetes.io/projected/7fb276f6-5e43-4b04-a290-42bfdc3b1125-kube-api-access-zwml6\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268729 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-var-lib-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268749 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7fb276f6-5e43-4b04-a290-42bfdc3b1125-cni-binary-copy\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268769 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvjr4\" (UniqueName: \"kubernetes.io/projected/b4a262bc-bc77-471f-91d7-58fb221fa404-kube-api-access-dvjr4\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268785 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268805 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-ovn\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268858 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268892 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268926 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268945 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnp2c\" (UniqueName: \"kubernetes.io/projected/9de69f49-3e33-4721-9fee-ad2fc45b16bf-kube-api-access-rnp2c\") pod \"node-resolver-9989l\" (UID: \"9de69f49-3e33-4721-9fee-ad2fc45b16bf\") " pod="openshift-dns/node-resolver-9989l"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268962 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-daemon-config\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268979 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4a262bc-bc77-471f-91d7-58fb221fa404-host\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268997 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-systemd-units\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269012 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-env-overrides\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269030 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lwc\" (UniqueName: \"kubernetes.io/projected/708f426f-f477-476b-92eb-7ab94a133335-kube-api-access-v7lwc\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269054 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269071 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269091 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c8d8877-1961-407f-b4a7-66e55321a6eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269110 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9de69f49-3e33-4721-9fee-ad2fc45b16bf-hosts-file\") pod \"node-resolver-9989l\" (UID: \"9de69f49-3e33-4721-9fee-ad2fc45b16bf\") " pod="openshift-dns/node-resolver-9989l"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269128 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ngx\" (UniqueName: \"kubernetes.io/projected/8f769618-965f-430a-8f67-e1ef4d94a063-kube-api-access-d7ngx\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269146 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-cnibin\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269161 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-hostroot\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269179 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269203 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269220 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mbpj\" (UniqueName: \"kubernetes.io/projected/c4742f60-e555-4f96-be12-b9e46a857bd4-kube-api-access-9mbpj\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") "
pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269243 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269261 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-cni-bin\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-node-log\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269369 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269383 4725 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269395 4725 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269407 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269420 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269433 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269444 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269455 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269466 4725 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269478 4725 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269489 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269500 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269511 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269522 4725 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269535 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269547 4725 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269557 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269570 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269582 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269594 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269606 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269617 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269629 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269640 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264677 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264844 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264868 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264798 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.264966 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265189 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265230 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265278 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265558 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265581 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.265676 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266005 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266066 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266104 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266183 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266352 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266506 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266849 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266687 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.266929 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267149 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267228 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267485 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267521 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267793 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.267942 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268094 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268309 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268324 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.268801 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269085 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269313 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.269784 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.270025 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.270261 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.270638 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.271059 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.271764 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.271800 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.272123 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.272138 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.272311 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.272558 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.272639 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.272705 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.272982 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.273255 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.273467 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.273474 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.273506 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274250 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274350 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274446 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274515 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274588 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274774 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274812 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274837 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274868 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.274895 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.275019 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.275112 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.275197 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.275242 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.275484 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.275765 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276207 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276248 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276293 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276329 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276340 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276361 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276374 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.276974 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.277151 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.277605 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.277666 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.278376 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.278505 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.278942 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.279065 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.279174 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-25 10:54:11.779145305 +0000 UTC m=+77.277727350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.279552 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.279899 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.279961 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.279969 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.280240 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.280262 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.280343 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.280378 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.280744 4725 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.280876 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.280896 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.281189 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.281300 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.281418 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.281215 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.281477 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.283078 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.285945 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.286012 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:11.785991414 +0000 UTC m=+77.284573439 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.287285 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.287508 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:54:11.787497833 +0000 UTC m=+77.286079858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.288989 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.299076 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.301315 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.301559 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.301800 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.302408 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.303047 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.313730 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.314579 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.314608 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.314797 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.314929 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.314958 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.305533 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.291819 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.305297 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.305732 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.305757 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.305932 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.306439 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.306555 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.306821 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.307163 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.307171 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.308964 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.309186 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.309312 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.309858 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.309911 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.310083 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.310308 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.310330 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.310470 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.310653 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.310855 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.310936 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311044 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311206 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311177 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311232 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311244 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311309 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311536 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311562 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.311674 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.312132 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.312145 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.312167 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.314139 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.314235 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.315959 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.316078 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316320 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316340 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316358 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316452 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:11.816411219 +0000 UTC m=+77.314993244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316529 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316547 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316571 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.316640 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:11.816618584 +0000 UTC m=+77.315200609 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.316956 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.303080 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.320250 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.320299 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.320330 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.320327 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.320365 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.320431 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.320723 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.321141 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.323506 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.330138 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.330505 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.330968 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.331048 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.331716 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.332622 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.332900 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.332899 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.333023 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.333377 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.333945 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.333978 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334011 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334026 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334338 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334445 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334447 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334573 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334709 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.334884 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.335033 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.335218 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.341999 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.348168 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.358216 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.361218 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.369528 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370300 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovn-node-metrics-cert\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370340 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-os-release\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370385 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f769618-965f-430a-8f67-e1ef4d94a063-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370405 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-k8s-cni-cncf-io\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370420 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-netns\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " 
pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370437 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-conf-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370456 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-netd\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370472 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370487 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-log-socket\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370511 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4742f60-e555-4f96-be12-b9e46a857bd4-mcd-auth-proxy-config\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc 
kubenswrapper[4725]: I0225 10:54:11.370527 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c8d8877-1961-407f-b4a7-66e55321a6eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370545 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f769618-965f-430a-8f67-e1ef4d94a063-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370560 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-system-cni-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370577 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370591 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-kubelet\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: 
I0225 10:54:11.370605 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-systemd\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370619 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hct4s\" (UniqueName: \"kubernetes.io/projected/07a39624-e0d8-44dc-9596-cd7224f58d5d-kube-api-access-hct4s\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370614 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-netd\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370644 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4742f60-e555-4f96-be12-b9e46a857bd4-proxy-tls\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370683 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-netns\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370721 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-system-cni-dir\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370752 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-cni-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370805 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-multus-certs\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370800 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370864 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370886 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-cnibin\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370905 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370947 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f769618-965f-430a-8f67-e1ef4d94a063-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370972 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-etc-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: 
\"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.370990 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-bin\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371011 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-os-release\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371036 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-script-lib\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371064 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-cni-multus\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371066 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-log-socket\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.371126 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.371169 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:11.87115457 +0000 UTC m=+77.369736595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371220 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-cnibin\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371241 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-etc-kubernetes\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4a262bc-bc77-471f-91d7-58fb221fa404-serviceca\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " 
pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372201 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-slash\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372242 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-netns\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372266 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-config\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372281 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c8d8877-1961-407f-b4a7-66e55321a6eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371542 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f769618-965f-430a-8f67-e1ef4d94a063-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371647 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-cni-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-etc-kubernetes\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371672 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-multus-certs\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371694 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371939 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4742f60-e555-4f96-be12-b9e46a857bd4-mcd-auth-proxy-config\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 
10:54:11.371716 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-cni-multus\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372108 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372288 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c4742f60-e555-4f96-be12-b9e46a857bd4-rootfs\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372445 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-socket-dir-parent\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372463 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwml6\" (UniqueName: 
\"kubernetes.io/projected/7fb276f6-5e43-4b04-a290-42bfdc3b1125-kube-api-access-zwml6\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372480 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-var-lib-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372501 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7fb276f6-5e43-4b04-a290-42bfdc3b1125-cni-binary-copy\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372527 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvjr4\" (UniqueName: \"kubernetes.io/projected/b4a262bc-bc77-471f-91d7-58fb221fa404-kube-api-access-dvjr4\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372550 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371409 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f769618-965f-430a-8f67-e1ef4d94a063-env-overrides\") pod 
\"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372583 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-netns\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372573 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-ovn\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372605 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-ovn\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372654 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnp2c\" (UniqueName: \"kubernetes.io/projected/9de69f49-3e33-4721-9fee-ad2fc45b16bf-kube-api-access-rnp2c\") pod \"node-resolver-9989l\" (UID: \"9de69f49-3e33-4721-9fee-ad2fc45b16bf\") " pod="openshift-dns/node-resolver-9989l" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372677 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-daemon-config\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " 
pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372697 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-systemd-units\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372716 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-env-overrides\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lwc\" (UniqueName: \"kubernetes.io/projected/708f426f-f477-476b-92eb-7ab94a133335-kube-api-access-v7lwc\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372755 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4a262bc-bc77-471f-91d7-58fb221fa404-host\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372775 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 10:54:11 crc 
kubenswrapper[4725]: I0225 10:54:11.372795 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c8d8877-1961-407f-b4a7-66e55321a6eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372812 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9de69f49-3e33-4721-9fee-ad2fc45b16bf-hosts-file\") pod \"node-resolver-9989l\" (UID: \"9de69f49-3e33-4721-9fee-ad2fc45b16bf\") " pod="openshift-dns/node-resolver-9989l" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372852 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ngx\" (UniqueName: \"kubernetes.io/projected/8f769618-965f-430a-8f67-e1ef4d94a063-kube-api-access-d7ngx\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372871 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-cnibin\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372889 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-hostroot\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372911 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372933 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mbpj\" (UniqueName: \"kubernetes.io/projected/c4742f60-e555-4f96-be12-b9e46a857bd4-kube-api-access-9mbpj\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372948 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-var-lib-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372952 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-cni-bin\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372975 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-cni-bin\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372993 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-node-log\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373019 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r45dq\" (UniqueName: \"kubernetes.io/projected/0c8d8877-1961-407f-b4a7-66e55321a6eb-kube-api-access-r45dq\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374002 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-systemd-units\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371444 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-bin\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372316 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c4742f60-e555-4f96-be12-b9e46a857bd4-rootfs\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374024 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-os-release\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374061 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-kubelet\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374151 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c8d8877-1961-407f-b4a7-66e55321a6eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371310 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-system-cni-dir\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372654 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-socket-dir-parent\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373680 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/b4a262bc-bc77-471f-91d7-58fb221fa404-serviceca\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373701 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4a262bc-bc77-471f-91d7-58fb221fa404-host\") pod \"node-ca-8zw9d\" (UID: \"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373728 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373979 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-daemon-config\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373092 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-etc-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374333 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-env-overrides\") pod \"ovnkube-node-6klc9\" (UID: 
\"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.372551 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-script-lib\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.371288 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-multus-conf-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373063 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-config\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374521 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-kubelet\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374543 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c8d8877-1961-407f-b4a7-66e55321a6eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 
10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.373337 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-slash\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374568 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-openvswitch\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374580 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9de69f49-3e33-4721-9fee-ad2fc45b16bf-hosts-file\") pod \"node-resolver-9989l\" (UID: \"9de69f49-3e33-4721-9fee-ad2fc45b16bf\") " pod="openshift-dns/node-resolver-9989l" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374581 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374617 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-node-log\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374636 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-var-lib-kubelet\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374668 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-system-cni-dir\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374846 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-cnibin\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374869 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-hostroot\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.374994 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-systemd\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375045 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-os-release\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " 
pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375234 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375298 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375316 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375329 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375342 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375353 4725 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375363 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375375 4725 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375387 4725 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375398 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375407 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375417 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375428 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375437 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375448 4725 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375458 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375468 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375478 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375489 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375499 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375510 4725 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375522 4725 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375532 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375542 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375552 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375563 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375574 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375586 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375595 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375606 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375620 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375633 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375651 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375668 4725 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375683 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375693 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375703 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375713 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375725 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375737 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375752 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375769 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375781 4725 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node 
\"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375792 4725 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375803 4725 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375813 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375887 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375900 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375913 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375926 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375939 4725 
reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375954 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375967 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375979 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375992 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376008 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376022 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376035 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376070 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376116 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.375272 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7fb276f6-5e43-4b04-a290-42bfdc3b1125-cni-binary-copy\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376131 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376175 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376187 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376200 4725 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376210 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376220 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376231 4725 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376242 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376252 4725 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376264 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376277 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376287 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376297 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376307 4725 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376318 4725 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376329 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376340 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376350 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 25 
10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376360 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376373 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376385 4725 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376398 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376409 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376422 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376433 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376444 4725 reconciler_common.go:293] "Volume 
detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376455 4725 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376465 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376476 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376487 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376498 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376509 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.376519 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391401 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391421 4725 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391436 4725 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.377922 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7fb276f6-5e43-4b04-a290-42bfdc3b1125-host-run-k8s-cni-cncf-io\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.383821 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4742f60-e555-4f96-be12-b9e46a857bd4-proxy-tls\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.390268 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvjr4\" (UniqueName: \"kubernetes.io/projected/b4a262bc-bc77-471f-91d7-58fb221fa404-kube-api-access-dvjr4\") pod \"node-ca-8zw9d\" (UID: 
\"b4a262bc-bc77-471f-91d7-58fb221fa404\") " pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.390427 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391456 4725 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391570 4725 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391585 4725 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391596 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391607 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 
10:54:11.391617 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391630 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391640 4725 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391650 4725 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.384225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.378161 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovn-node-metrics-cert\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391660 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391685 4725 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391693 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391698 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391705 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391718 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391732 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391719 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391947 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391960 4725 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391975 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391986 4725 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.391998 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392009 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392023 4725 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392035 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392049 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392061 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392073 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392085 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392098 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392143 4725 reconciler_common.go:293] "Volume 
detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392156 4725 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392168 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392181 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392193 4725 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392428 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392444 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392452 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392464 4725 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392472 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392480 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392490 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392501 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392513 4725 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392523 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 
crc kubenswrapper[4725]: I0225 10:54:11.392536 4725 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392547 4725 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392555 4725 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392563 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392571 4725 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392580 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392590 4725 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392602 4725 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392615 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392627 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392639 4725 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392648 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392657 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392666 4725 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392674 4725 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath 
\"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392686 4725 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392697 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392709 4725 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392721 4725 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392738 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392749 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392759 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392771 4725 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392783 4725 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392795 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392809 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392839 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392852 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392864 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392876 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392889 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392901 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392913 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.392925 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.393521 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnp2c\" (UniqueName: \"kubernetes.io/projected/9de69f49-3e33-4721-9fee-ad2fc45b16bf-kube-api-access-rnp2c\") pod \"node-resolver-9989l\" (UID: \"9de69f49-3e33-4721-9fee-ad2fc45b16bf\") " pod="openshift-dns/node-resolver-9989l" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.393970 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mbpj\" (UniqueName: \"kubernetes.io/projected/c4742f60-e555-4f96-be12-b9e46a857bd4-kube-api-access-9mbpj\") pod \"machine-config-daemon-256sf\" (UID: \"c4742f60-e555-4f96-be12-b9e46a857bd4\") " 
pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.394561 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f769618-965f-430a-8f67-e1ef4d94a063-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.398762 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwml6\" (UniqueName: \"kubernetes.io/projected/7fb276f6-5e43-4b04-a290-42bfdc3b1125-kube-api-access-zwml6\") pod \"multus-d6b9f\" (UID: \"7fb276f6-5e43-4b04-a290-42bfdc3b1125\") " pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.398807 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hct4s\" (UniqueName: \"kubernetes.io/projected/07a39624-e0d8-44dc-9596-cd7224f58d5d-kube-api-access-hct4s\") pod \"ovnkube-node-6klc9\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.399461 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ngx\" (UniqueName: \"kubernetes.io/projected/8f769618-965f-430a-8f67-e1ef4d94a063-kube-api-access-d7ngx\") pod \"ovnkube-control-plane-749d76644c-rtvsj\" (UID: \"8f769618-965f-430a-8f67-e1ef4d94a063\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.402273 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.402593 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-r45dq\" (UniqueName: \"kubernetes.io/projected/0c8d8877-1961-407f-b4a7-66e55321a6eb-kube-api-access-r45dq\") pod \"multus-additional-cni-plugins-9mhzp\" (UID: \"0c8d8877-1961-407f-b4a7-66e55321a6eb\") " pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.407274 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lwc\" (UniqueName: \"kubernetes.io/projected/708f426f-f477-476b-92eb-7ab94a133335-kube-api-access-v7lwc\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.414736 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.426385 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.494674 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.494739 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.494749 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.494772 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.494788 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.511038 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.520861 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.528465 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 25 10:54:11 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: source /etc/kubernetes/apiserver-url.env Feb 25 10:54:11 crc kubenswrapper[4725]: else Feb 25 10:54:11 crc kubenswrapper[4725]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 25 10:54:11 crc kubenswrapper[4725]: exit 1 Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 25 10:54:11 crc kubenswrapper[4725]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.529654 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.531992 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 10:54:11 crc kubenswrapper[4725]: W0225 10:54:11.533590 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9d01c23f9cd934a5abecc2972251193901878f38cb497fb58b3cb977d01a72df WatchSource:0}: Error finding container 9d01c23f9cd934a5abecc2972251193901878f38cb497fb58b3cb977d01a72df: Status 404 returned error can't find the container with id 9d01c23f9cd934a5abecc2972251193901878f38cb497fb58b3cb977d01a72df Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.537941 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d6b9f" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.538065 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: source "/env/_master" Feb 25 10:54:11 crc kubenswrapper[4725]: set +o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 25 10:54:11 crc kubenswrapper[4725]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 25 10:54:11 crc kubenswrapper[4725]: ho_enable="--enable-hybrid-overlay" Feb 25 10:54:11 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 25 10:54:11 crc kubenswrapper[4725]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 25 10:54:11 crc kubenswrapper[4725]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --webhook-host=127.0.0.1 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --webhook-port=9743 \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${ho_enable} \ Feb 25 10:54:11 crc kubenswrapper[4725]: --enable-interconnect \ Feb 25 10:54:11 crc kubenswrapper[4725]: --disable-approver \ Feb 25 10:54:11 crc kubenswrapper[4725]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --wait-for-kubernetes-api=200s \ Feb 25 10:54:11 crc kubenswrapper[4725]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.540982 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 25 10:54:11 crc 
kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: source "/env/_master" Feb 25 10:54:11 crc kubenswrapper[4725]: set +o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --disable-webhook \ Feb 25 10:54:11 crc kubenswrapper[4725]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.542543 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.546050 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.547638 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.549799 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.554501 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.560995 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b8ebc18b52a9923909b9b3776c6a0b7e48515a1b4c16846e6390f725998f2df8"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.562220 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4672e1f08dee936e284fd77cf2363a5ef6793cb288511d30d726455d39881202"} Feb 25 10:54:11 crc kubenswrapper[4725]: W0225 10:54:11.564539 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fb276f6_5e43_4b04_a290_42bfdc3b1125.slice/crio-145aae69f4701de751c5b7e24ad1f3e338ca319950789fd6bab0db59c48ead5a WatchSource:0}: Error finding container 145aae69f4701de751c5b7e24ad1f3e338ca319950789fd6bab0db59c48ead5a: Status 404 returned error can't find the container with id 145aae69f4701de751c5b7e24ad1f3e338ca319950789fd6bab0db59c48ead5a Feb 25 10:54:11 crc kubenswrapper[4725]: W0225 10:54:11.565463 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f769618_965f_430a_8f67_e1ef4d94a063.slice/crio-8bdba2bccee6f5b4a8d3a2d48dbc75560ea2d6ae4657e643f38ddf6a74c3dbb9 WatchSource:0}: Error finding container 8bdba2bccee6f5b4a8d3a2d48dbc75560ea2d6ae4657e643f38ddf6a74c3dbb9: Status 404 returned error can't find the container with id 8bdba2bccee6f5b4a8d3a2d48dbc75560ea2d6ae4657e643f38ddf6a74c3dbb9 Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.566428 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9d01c23f9cd934a5abecc2972251193901878f38cb497fb58b3cb977d01a72df"} Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.567243 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.568412 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.571549 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Feb 25 10:54:11 crc kubenswrapper[4725]: set -euo pipefail Feb 25 10:54:11 crc kubenswrapper[4725]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Feb 25 10:54:11 crc kubenswrapper[4725]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Feb 25 10:54:11 crc kubenswrapper[4725]: # As the secret mount is optional we must wait for the files to be present. Feb 25 10:54:11 crc kubenswrapper[4725]: # The service is created in monitor.yaml and this is created in sdn.yaml. Feb 25 10:54:11 crc kubenswrapper[4725]: TS=$(date +%s) Feb 25 10:54:11 crc kubenswrapper[4725]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Feb 25 10:54:11 crc kubenswrapper[4725]: HAS_LOGGED_INFO=0 Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: log_missing_certs(){ Feb 25 10:54:11 crc kubenswrapper[4725]: CUR_TS=$(date +%s) Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Feb 25 10:54:11 crc kubenswrapper[4725]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Feb 25 10:54:11 crc kubenswrapper[4725]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Feb 25 10:54:11 crc kubenswrapper[4725]: HAS_LOGGED_INFO=1 Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: } Feb 25 10:54:11 crc kubenswrapper[4725]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Feb 25 10:54:11 crc kubenswrapper[4725]: log_missing_certs Feb 25 10:54:11 crc kubenswrapper[4725]: sleep 5 Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/kube-rbac-proxy \ Feb 25 10:54:11 crc kubenswrapper[4725]: --logtostderr \ Feb 25 10:54:11 crc kubenswrapper[4725]: --secure-listen-address=:9108 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --upstream=http://127.0.0.1:29108/ \ Feb 25 10:54:11 crc kubenswrapper[4725]: --tls-private-key-file=${TLS_PK} \ Feb 25 10:54:11 crc kubenswrapper[4725]: --tls-cert-file=${TLS_CERT} Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7ngx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-rtvsj_openshift-ovn-kubernetes(8f769618-965f-430a-8f67-e1ef4d94a063): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.571721 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 25 10:54:11 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: source /etc/kubernetes/apiserver-url.env Feb 25 10:54:11 crc kubenswrapper[4725]: else Feb 25 10:54:11 crc kubenswrapper[4725]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 25 10:54:11 crc kubenswrapper[4725]: exit 1 Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/cluster-network-operator start 
--listen=0.0.0.0:9104 Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},E
nvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectF
ieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.571923 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Feb 25 10:54:11 crc kubenswrapper[4725]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Feb 25 10:54:11 crc kubenswrapper[4725]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwml6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d6b9f_openshift-multus(7fb276f6-5e43-4b04-a290-42bfdc3b1125): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.572035 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: source "/env/_master" Feb 25 10:54:11 crc kubenswrapper[4725]: set +o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 25 10:54:11 crc kubenswrapper[4725]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 25 10:54:11 crc kubenswrapper[4725]: ho_enable="--enable-hybrid-overlay" Feb 25 10:54:11 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 25 10:54:11 crc kubenswrapper[4725]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 25 10:54:11 crc kubenswrapper[4725]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --webhook-host=127.0.0.1 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --webhook-port=9743 \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${ho_enable} \ Feb 25 10:54:11 crc kubenswrapper[4725]: --enable-interconnect \ Feb 25 10:54:11 crc kubenswrapper[4725]: --disable-approver \ Feb 25 10:54:11 crc kubenswrapper[4725]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --wait-for-kubernetes-api=200s \ Feb 25 10:54:11 crc kubenswrapper[4725]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.572618 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.572933 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.573015 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d6b9f" podUID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.576237 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: source "/env/_master" Feb 25 10:54:11 crc kubenswrapper[4725]: set +o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 25 10:54:11 crc kubenswrapper[4725]: --disable-webhook \ Feb 25 10:54:11 crc kubenswrapper[4725]: 
--csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.576313 4725 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: source "/env/_master" Feb 25 10:54:11 crc kubenswrapper[4725]: set +o allexport Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v4_join_subnet_opt= Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v6_join_subnet_opt= Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v4_transit_switch_subnet_opt= Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v6_transit_switch_subnet_opt= Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: dns_name_resolver_enabled_flag= Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "false" == "true" 
]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: persistent_ips_enabled_flag= Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "true" == "true" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: persistent_ips_enabled_flag="--enable-persistent-ips" Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: # This is needed so that converting clusters from GA to TP Feb 25 10:54:11 crc kubenswrapper[4725]: # will rollout control plane pods as well Feb 25 10:54:11 crc kubenswrapper[4725]: network_segmentation_enabled_flag= Feb 25 10:54:11 crc kubenswrapper[4725]: multi_network_enabled_flag= Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "true" == "true" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: multi_network_enabled_flag="--enable-multi-network" Feb 25 10:54:11 crc kubenswrapper[4725]: network_segmentation_enabled_flag="--enable-network-segmentation" Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Feb 25 10:54:11 crc kubenswrapper[4725]: exec /usr/bin/ovnkube \ Feb 25 10:54:11 crc kubenswrapper[4725]: --enable-interconnect \ Feb 25 10:54:11 crc kubenswrapper[4725]: --init-cluster-manager "${K8S_NODE}" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --config-file=/run/ovnkube-config/ovnkube.conf \ Feb 25 10:54:11 crc kubenswrapper[4725]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --metrics-bind-address "127.0.0.1:29108" \ Feb 25 10:54:11 crc kubenswrapper[4725]: --metrics-enable-pprof \ Feb 25 10:54:11 crc kubenswrapper[4725]: --metrics-enable-config-duration \ Feb 25 10:54:11 crc 
kubenswrapper[4725]: ${ovn_v4_join_subnet_opt} \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${ovn_v6_join_subnet_opt} \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${ovn_v4_transit_switch_subnet_opt} \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${ovn_v6_transit_switch_subnet_opt} \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${dns_name_resolver_enabled_flag} \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${persistent_ips_enabled_flag} \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${multi_network_enabled_flag} \ Feb 25 10:54:11 crc kubenswrapper[4725]: ${network_segmentation_enabled_flag} Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7ngx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-rtvsj_openshift-ovn-kubernetes(8f769618-965f-430a-8f67-e1ef4d94a063): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.577344 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.577414 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct 
envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" podUID="8f769618-965f-430a-8f67-e1ef4d94a063" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.578283 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mbpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.579533 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.580295 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mbpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.582472 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.581964 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.590015 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.597487 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.597528 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.597538 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.597555 4725 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.597566 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.604149 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r45dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifec
ycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-9mhzp_openshift-multus(0c8d8877-1961-407f-b4a7-66e55321a6eb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.605287 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" podUID="0c8d8877-1961-407f-b4a7-66e55321a6eb" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.611530 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.620784 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9989l" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.623375 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.632295 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: W0225 10:54:11.633216 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de69f49_3e33_4721_9fee_ad2fc45b16bf.slice/crio-a4d065f33b3496d81cbb3a125b82b3c099bab2bb2780619c4807a93cc7395b93 WatchSource:0}: Error finding container a4d065f33b3496d81cbb3a125b82b3c099bab2bb2780619c4807a93cc7395b93: Status 404 returned error can't find the container with id a4d065f33b3496d81cbb3a125b82b3c099bab2bb2780619c4807a93cc7395b93 Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.635916 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Feb 25 10:54:11 crc kubenswrapper[4725]: set -uo pipefail Feb 25 10:54:11 crc 
kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Feb 25 10:54:11 crc kubenswrapper[4725]: HOSTS_FILE="/etc/hosts" Feb 25 10:54:11 crc kubenswrapper[4725]: TEMP_FILE="/etc/hosts.tmp" Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: IFS=', ' read -r -a services <<< "${SERVICES}" Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: # Make a temporary file with the old hosts file's attributes. Feb 25 10:54:11 crc kubenswrapper[4725]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Feb 25 10:54:11 crc kubenswrapper[4725]: echo "Failed to preserve hosts file. Exiting." Feb 25 10:54:11 crc kubenswrapper[4725]: exit 1 Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: while true; do Feb 25 10:54:11 crc kubenswrapper[4725]: declare -A svc_ips Feb 25 10:54:11 crc kubenswrapper[4725]: for svc in "${services[@]}"; do Feb 25 10:54:11 crc kubenswrapper[4725]: # Fetch service IP from cluster dns if present. We make several tries Feb 25 10:54:11 crc kubenswrapper[4725]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Feb 25 10:54:11 crc kubenswrapper[4725]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Feb 25 10:54:11 crc kubenswrapper[4725]: # support UDP loadbalancers and require reaching DNS through TCP. 
Feb 25 10:54:11 crc kubenswrapper[4725]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Feb 25 10:54:11 crc kubenswrapper[4725]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Feb 25 10:54:11 crc kubenswrapper[4725]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Feb 25 10:54:11 crc kubenswrapper[4725]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Feb 25 10:54:11 crc kubenswrapper[4725]: for i in ${!cmds[*]} Feb 25 10:54:11 crc kubenswrapper[4725]: do Feb 25 10:54:11 crc kubenswrapper[4725]: ips=($(eval "${cmds[i]}")) Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: svc_ips["${svc}"]="${ips[@]}" Feb 25 10:54:11 crc kubenswrapper[4725]: break Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: # Update /etc/hosts only if we get valid service IPs Feb 25 10:54:11 crc kubenswrapper[4725]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Feb 25 10:54:11 crc kubenswrapper[4725]: # Stale entries could exist in /etc/hosts if the service is deleted Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ -n "${svc_ips[*]-}" ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Feb 25 10:54:11 crc kubenswrapper[4725]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Feb 25 10:54:11 crc kubenswrapper[4725]: # Only continue rebuilding the hosts entries if its original content is preserved Feb 25 10:54:11 crc kubenswrapper[4725]: sleep 60 & wait Feb 25 10:54:11 crc kubenswrapper[4725]: continue Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: # Append resolver entries for services Feb 25 10:54:11 crc kubenswrapper[4725]: rc=0 Feb 25 10:54:11 crc kubenswrapper[4725]: for svc in "${!svc_ips[@]}"; do Feb 25 10:54:11 crc kubenswrapper[4725]: for ip in ${svc_ips[${svc}]}; do Feb 25 10:54:11 crc kubenswrapper[4725]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: if [[ $rc -ne 0 ]]; then Feb 25 10:54:11 crc kubenswrapper[4725]: sleep 60 & wait Feb 25 10:54:11 crc kubenswrapper[4725]: continue Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: Feb 25 10:54:11 crc kubenswrapper[4725]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Feb 25 10:54:11 crc kubenswrapper[4725]: # Replace /etc/hosts with our modified version if needed Feb 25 10:54:11 crc kubenswrapper[4725]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Feb 25 10:54:11 crc kubenswrapper[4725]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: sleep 60 & wait Feb 25 10:54:11 crc kubenswrapper[4725]: unset svc_ips Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rnp2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9989l_openshift-dns(9de69f49-3e33-4721-9fee-ad2fc45b16bf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.637091 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9989l" 
podUID="9de69f49-3e33-4721-9fee-ad2fc45b16bf" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.638351 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8zw9d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.643650 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.646903 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.654682 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.658328 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; 
echo shutting down node-ca; exit 0' TERM Feb 25 10:54:11 crc kubenswrapper[4725]: while [ true ]; Feb 25 10:54:11 crc kubenswrapper[4725]: do Feb 25 10:54:11 crc kubenswrapper[4725]: for f in $(ls /tmp/serviceca); do Feb 25 10:54:11 crc kubenswrapper[4725]: echo $f Feb 25 10:54:11 crc kubenswrapper[4725]: ca_file_path="/tmp/serviceca/${f}" Feb 25 10:54:11 crc kubenswrapper[4725]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Feb 25 10:54:11 crc kubenswrapper[4725]: reg_dir_path="/etc/docker/certs.d/${f}" Feb 25 10:54:11 crc kubenswrapper[4725]: if [ -e "${reg_dir_path}" ]; then Feb 25 10:54:11 crc kubenswrapper[4725]: cp -u $ca_file_path $reg_dir_path/ca.crt Feb 25 10:54:11 crc kubenswrapper[4725]: else Feb 25 10:54:11 crc kubenswrapper[4725]: mkdir $reg_dir_path Feb 25 10:54:11 crc kubenswrapper[4725]: cp $ca_file_path $reg_dir_path/ca.crt Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: for d in $(ls /etc/docker/certs.d); do Feb 25 10:54:11 crc kubenswrapper[4725]: echo $d Feb 25 10:54:11 crc kubenswrapper[4725]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Feb 25 10:54:11 crc kubenswrapper[4725]: reg_conf_path="/tmp/serviceca/${dp}" Feb 25 10:54:11 crc kubenswrapper[4725]: if [ ! 
-e "${reg_conf_path}" ]; then Feb 25 10:54:11 crc kubenswrapper[4725]: rm -rf /etc/docker/certs.d/$d Feb 25 10:54:11 crc kubenswrapper[4725]: fi Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: sleep 60 & wait ${!} Feb 25 10:54:11 crc kubenswrapper[4725]: done Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dvjr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-8zw9d_openshift-image-registry(b4a262bc-bc77-471f-91d7-58fb221fa404): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.659421 4725 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-8zw9d" podUID="b4a262bc-bc77-471f-91d7-58fb221fa404" Feb 25 10:54:11 crc kubenswrapper[4725]: W0225 10:54:11.661456 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-6e754f79b582e88daaa8265d5628448ee5846cd084b944e9b061e538e4054258 WatchSource:0}: Error finding container 6e754f79b582e88daaa8265d5628448ee5846cd084b944e9b061e538e4054258: Status 404 returned error can't find the container with id 6e754f79b582e88daaa8265d5628448ee5846cd084b944e9b061e538e4054258 Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.664705 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:11 crc kubenswrapper[4725]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Feb 25 10:54:11 crc kubenswrapper[4725]: apiVersion: v1 Feb 25 10:54:11 crc kubenswrapper[4725]: clusters: Feb 25 10:54:11 crc kubenswrapper[4725]: - cluster: Feb 25 10:54:11 crc kubenswrapper[4725]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Feb 25 10:54:11 crc kubenswrapper[4725]: server: https://api-int.crc.testing:6443 Feb 25 10:54:11 crc kubenswrapper[4725]: name: default-cluster Feb 25 10:54:11 crc kubenswrapper[4725]: contexts: Feb 25 10:54:11 crc kubenswrapper[4725]: - context: Feb 25 10:54:11 crc kubenswrapper[4725]: cluster: default-cluster Feb 25 10:54:11 crc kubenswrapper[4725]: namespace: default Feb 25 10:54:11 crc kubenswrapper[4725]: user: default-auth Feb 25 10:54:11 crc kubenswrapper[4725]: name: 
default-context Feb 25 10:54:11 crc kubenswrapper[4725]: current-context: default-context Feb 25 10:54:11 crc kubenswrapper[4725]: kind: Config Feb 25 10:54:11 crc kubenswrapper[4725]: preferences: {} Feb 25 10:54:11 crc kubenswrapper[4725]: users: Feb 25 10:54:11 crc kubenswrapper[4725]: - name: default-auth Feb 25 10:54:11 crc kubenswrapper[4725]: user: Feb 25 10:54:11 crc kubenswrapper[4725]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Feb 25 10:54:11 crc kubenswrapper[4725]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Feb 25 10:54:11 crc kubenswrapper[4725]: EOF Feb 25 10:54:11 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hct4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:11 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.664718 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.667201 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.676716 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.687520 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.696990 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.700221 4725 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.700246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.700254 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.700270 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.700281 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.709593 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.717947 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.728694 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.739461 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.750716 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.763897 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.773542 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.790947 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.796877 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.797063 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:54:12.79703132 +0000 UTC m=+78.295613375 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.797329 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.797552 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.797574 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.797952 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:12.797926954 +0000 UTC m=+78.296509009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.797701 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.798478 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:12.798456878 +0000 UTC m=+78.297038943 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.801492 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.803310 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.803408 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.803431 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.803460 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 
10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.803485 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.813252 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.824381 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.835389 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.861914 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.875279 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.889036 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.899280 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.899376 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 
10:54:11.899414 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.899619 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.899657 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.899679 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.899748 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:12.899723894 +0000 UTC m=+78.398305959 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.900536 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.900637 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:12.900617288 +0000 UTC m=+78.399199343 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.900708 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.900758 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.900778 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: E0225 10:54:11.900889 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:12.900823103 +0000 UTC m=+78.399405168 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.908194 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.908242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.908264 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.908293 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.908315 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:11Z","lastTransitionTime":"2026-02-25T10:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:11 crc kubenswrapper[4725]: I0225 10:54:11.918666 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.011040 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.011087 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.011103 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.011127 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.011145 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.114368 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.114415 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.114432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.114454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.114475 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.217316 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.217370 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.217393 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.217421 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.217442 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.321425 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.321489 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.321513 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.321585 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.321614 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.425097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.425162 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.425179 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.425205 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.425222 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.528118 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.528186 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.528210 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.528242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.528272 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.570067 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"6e754f79b582e88daaa8265d5628448ee5846cd084b944e9b061e538e4054258"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.571454 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8zw9d" event={"ID":"b4a262bc-bc77-471f-91d7-58fb221fa404","Type":"ContainerStarted","Data":"3ced6efc36b3d049b5c140ac0b26638ad4e1b91ac71810ccda0349dea1a87960"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.572796 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" event={"ID":"8f769618-965f-430a-8f67-e1ef4d94a063","Type":"ContainerStarted","Data":"8bdba2bccee6f5b4a8d3a2d48dbc75560ea2d6ae4657e643f38ddf6a74c3dbb9"} Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.573885 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:12 crc kubenswrapper[4725]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Feb 25 10:54:12 crc kubenswrapper[4725]: while [ true ]; Feb 25 10:54:12 crc kubenswrapper[4725]: do Feb 25 10:54:12 crc kubenswrapper[4725]: for f in $(ls /tmp/serviceca); do Feb 25 10:54:12 crc kubenswrapper[4725]: echo $f Feb 25 10:54:12 crc kubenswrapper[4725]: ca_file_path="/tmp/serviceca/${f}" Feb 25 10:54:12 crc kubenswrapper[4725]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Feb 25 10:54:12 crc kubenswrapper[4725]: reg_dir_path="/etc/docker/certs.d/${f}" Feb 25 10:54:12 crc kubenswrapper[4725]: if [ -e "${reg_dir_path}" ]; then Feb 25 10:54:12 crc 
kubenswrapper[4725]: cp -u $ca_file_path $reg_dir_path/ca.crt Feb 25 10:54:12 crc kubenswrapper[4725]: else Feb 25 10:54:12 crc kubenswrapper[4725]: mkdir $reg_dir_path Feb 25 10:54:12 crc kubenswrapper[4725]: cp $ca_file_path $reg_dir_path/ca.crt Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: for d in $(ls /etc/docker/certs.d); do Feb 25 10:54:12 crc kubenswrapper[4725]: echo $d Feb 25 10:54:12 crc kubenswrapper[4725]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Feb 25 10:54:12 crc kubenswrapper[4725]: reg_conf_path="/tmp/serviceca/${dp}" Feb 25 10:54:12 crc kubenswrapper[4725]: if [ ! -e "${reg_conf_path}" ]; then Feb 25 10:54:12 crc kubenswrapper[4725]: rm -rf /etc/docker/certs.d/$d Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: sleep 60 & wait ${!} Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dvjr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-8zw9d_openshift-image-registry(b4a262bc-bc77-471f-91d7-58fb221fa404): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:12 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.574105 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:12 crc kubenswrapper[4725]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Feb 25 10:54:12 crc kubenswrapper[4725]: set -euo pipefail Feb 25 10:54:12 crc kubenswrapper[4725]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Feb 25 10:54:12 crc kubenswrapper[4725]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Feb 25 10:54:12 crc 
kubenswrapper[4725]: # As the secret mount is optional we must wait for the files to be present. Feb 25 10:54:12 crc kubenswrapper[4725]: # The service is created in monitor.yaml and this is created in sdn.yaml. Feb 25 10:54:12 crc kubenswrapper[4725]: TS=$(date +%s) Feb 25 10:54:12 crc kubenswrapper[4725]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Feb 25 10:54:12 crc kubenswrapper[4725]: HAS_LOGGED_INFO=0 Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: log_missing_certs(){ Feb 25 10:54:12 crc kubenswrapper[4725]: CUR_TS=$(date +%s) Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Feb 25 10:54:12 crc kubenswrapper[4725]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Feb 25 10:54:12 crc kubenswrapper[4725]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Feb 25 10:54:12 crc kubenswrapper[4725]: HAS_LOGGED_INFO=1 Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: } Feb 25 10:54:12 crc kubenswrapper[4725]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Feb 25 10:54:12 crc kubenswrapper[4725]: log_missing_certs Feb 25 10:54:12 crc kubenswrapper[4725]: sleep 5 Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Feb 25 10:54:12 crc kubenswrapper[4725]: exec /usr/bin/kube-rbac-proxy \ Feb 25 10:54:12 crc kubenswrapper[4725]: --logtostderr \ Feb 25 10:54:12 crc kubenswrapper[4725]: --secure-listen-address=:9108 \ Feb 25 10:54:12 crc kubenswrapper[4725]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Feb 25 10:54:12 crc kubenswrapper[4725]: --upstream=http://127.0.0.1:29108/ \ Feb 25 10:54:12 crc kubenswrapper[4725]: --tls-private-key-file=${TLS_PK} \ Feb 25 10:54:12 crc kubenswrapper[4725]: --tls-cert-file=${TLS_CERT} Feb 25 10:54:12 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7ngx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-rtvsj_openshift-ovn-kubernetes(8f769618-965f-430a-8f67-e1ef4d94a063): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:12 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.574270 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerStarted","Data":"145aae69f4701de751c5b7e24ad1f3e338ca319950789fd6bab0db59c48ead5a"} Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.574322 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:12 crc kubenswrapper[4725]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Feb 25 10:54:12 crc kubenswrapper[4725]: apiVersion: v1 Feb 25 10:54:12 crc kubenswrapper[4725]: clusters: Feb 25 10:54:12 crc kubenswrapper[4725]: - cluster: Feb 25 10:54:12 crc kubenswrapper[4725]: certificate-authority: 
/var/run/secrets/kubernetes.io/serviceaccount/ca.crt Feb 25 10:54:12 crc kubenswrapper[4725]: server: https://api-int.crc.testing:6443 Feb 25 10:54:12 crc kubenswrapper[4725]: name: default-cluster Feb 25 10:54:12 crc kubenswrapper[4725]: contexts: Feb 25 10:54:12 crc kubenswrapper[4725]: - context: Feb 25 10:54:12 crc kubenswrapper[4725]: cluster: default-cluster Feb 25 10:54:12 crc kubenswrapper[4725]: namespace: default Feb 25 10:54:12 crc kubenswrapper[4725]: user: default-auth Feb 25 10:54:12 crc kubenswrapper[4725]: name: default-context Feb 25 10:54:12 crc kubenswrapper[4725]: current-context: default-context Feb 25 10:54:12 crc kubenswrapper[4725]: kind: Config Feb 25 10:54:12 crc kubenswrapper[4725]: preferences: {} Feb 25 10:54:12 crc kubenswrapper[4725]: users: Feb 25 10:54:12 crc kubenswrapper[4725]: - name: default-auth Feb 25 10:54:12 crc kubenswrapper[4725]: user: Feb 25 10:54:12 crc kubenswrapper[4725]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Feb 25 10:54:12 crc kubenswrapper[4725]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Feb 25 10:54:12 crc kubenswrapper[4725]: EOF Feb 25 10:54:12 crc kubenswrapper[4725]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hct4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:12 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.575267 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-8zw9d" podUID="b4a262bc-bc77-471f-91d7-58fb221fa404" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.576045 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.576365 4725 kuberuntime_manager.go:1274] "Unhandled 
Error" err=< Feb 25 10:54:12 crc kubenswrapper[4725]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Feb 25 10:54:12 crc kubenswrapper[4725]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Feb 25 10:54:12 crc kubenswrapper[4725]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwml6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d6b9f_openshift-multus(7fb276f6-5e43-4b04-a290-42bfdc3b1125): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:12 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.577269 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"9900e83c8c361671dbf6bd6d131f4be6e220eda43c9c83e4388d8850b8e3a7f0"} Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.577538 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:12 crc kubenswrapper[4725]: container 
&Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: set -o allexport Feb 25 10:54:12 crc kubenswrapper[4725]: source "/env/_master" Feb 25 10:54:12 crc kubenswrapper[4725]: set +o allexport Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v4_join_subnet_opt= Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v6_join_subnet_opt= Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v4_transit_switch_subnet_opt= Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v6_transit_switch_subnet_opt= Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "" != "" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: dns_name_resolver_enabled_flag= Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "false" == "true" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: 
dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: persistent_ips_enabled_flag= Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "true" == "true" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: persistent_ips_enabled_flag="--enable-persistent-ips" Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: # This is needed so that converting clusters from GA to TP Feb 25 10:54:12 crc kubenswrapper[4725]: # will rollout control plane pods as well Feb 25 10:54:12 crc kubenswrapper[4725]: network_segmentation_enabled_flag= Feb 25 10:54:12 crc kubenswrapper[4725]: multi_network_enabled_flag= Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "true" == "true" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: multi_network_enabled_flag="--enable-multi-network" Feb 25 10:54:12 crc kubenswrapper[4725]: network_segmentation_enabled_flag="--enable-network-segmentation" Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Feb 25 10:54:12 crc kubenswrapper[4725]: exec /usr/bin/ovnkube \ Feb 25 10:54:12 crc kubenswrapper[4725]: --enable-interconnect \ Feb 25 10:54:12 crc kubenswrapper[4725]: --init-cluster-manager "${K8S_NODE}" \ Feb 25 10:54:12 crc kubenswrapper[4725]: --config-file=/run/ovnkube-config/ovnkube.conf \ Feb 25 10:54:12 crc kubenswrapper[4725]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Feb 25 10:54:12 crc kubenswrapper[4725]: --metrics-bind-address "127.0.0.1:29108" \ Feb 25 10:54:12 crc kubenswrapper[4725]: --metrics-enable-pprof \ Feb 25 10:54:12 crc kubenswrapper[4725]: --metrics-enable-config-duration \ Feb 25 10:54:12 crc kubenswrapper[4725]: ${ovn_v4_join_subnet_opt} \ Feb 25 10:54:12 crc 
kubenswrapper[4725]: ${ovn_v6_join_subnet_opt} \ Feb 25 10:54:12 crc kubenswrapper[4725]: ${ovn_v4_transit_switch_subnet_opt} \ Feb 25 10:54:12 crc kubenswrapper[4725]: ${ovn_v6_transit_switch_subnet_opt} \ Feb 25 10:54:12 crc kubenswrapper[4725]: ${dns_name_resolver_enabled_flag} \ Feb 25 10:54:12 crc kubenswrapper[4725]: ${persistent_ips_enabled_flag} \ Feb 25 10:54:12 crc kubenswrapper[4725]: ${multi_network_enabled_flag} \ Feb 25 10:54:12 crc kubenswrapper[4725]: ${network_segmentation_enabled_flag} Feb 25 10:54:12 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7ngx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-rtvsj_openshift-ovn-kubernetes(8f769618-965f-430a-8f67-e1ef4d94a063): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:12 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.577615 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d6b9f" podUID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.578663 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" podUID="8f769618-965f-430a-8f67-e1ef4d94a063" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.578856 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mbpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.579171 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9989l" event={"ID":"9de69f49-3e33-4721-9fee-ad2fc45b16bf","Type":"ContainerStarted","Data":"a4d065f33b3496d81cbb3a125b82b3c099bab2bb2780619c4807a93cc7395b93"} Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.580676 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:54:12 crc kubenswrapper[4725]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Feb 25 10:54:12 crc kubenswrapper[4725]: set -uo pipefail Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc 
kubenswrapper[4725]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Feb 25 10:54:12 crc kubenswrapper[4725]: HOSTS_FILE="/etc/hosts" Feb 25 10:54:12 crc kubenswrapper[4725]: TEMP_FILE="/etc/hosts.tmp" Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: IFS=', ' read -r -a services <<< "${SERVICES}" Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: # Make a temporary file with the old hosts file's attributes. Feb 25 10:54:12 crc kubenswrapper[4725]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Feb 25 10:54:12 crc kubenswrapper[4725]: echo "Failed to preserve hosts file. Exiting." Feb 25 10:54:12 crc kubenswrapper[4725]: exit 1 Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: while true; do Feb 25 10:54:12 crc kubenswrapper[4725]: declare -A svc_ips Feb 25 10:54:12 crc kubenswrapper[4725]: for svc in "${services[@]}"; do Feb 25 10:54:12 crc kubenswrapper[4725]: # Fetch service IP from cluster dns if present. We make several tries Feb 25 10:54:12 crc kubenswrapper[4725]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Feb 25 10:54:12 crc kubenswrapper[4725]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Feb 25 10:54:12 crc kubenswrapper[4725]: # support UDP loadbalancers and require reaching DNS through TCP. 
Feb 25 10:54:12 crc kubenswrapper[4725]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Feb 25 10:54:12 crc kubenswrapper[4725]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Feb 25 10:54:12 crc kubenswrapper[4725]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Feb 25 10:54:12 crc kubenswrapper[4725]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Feb 25 10:54:12 crc kubenswrapper[4725]: for i in ${!cmds[*]} Feb 25 10:54:12 crc kubenswrapper[4725]: do Feb 25 10:54:12 crc kubenswrapper[4725]: ips=($(eval "${cmds[i]}")) Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: svc_ips["${svc}"]="${ips[@]}" Feb 25 10:54:12 crc kubenswrapper[4725]: break Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: # Update /etc/hosts only if we get valid service IPs Feb 25 10:54:12 crc kubenswrapper[4725]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Feb 25 10:54:12 crc kubenswrapper[4725]: # Stale entries could exist in /etc/hosts if the service is deleted Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ -n "${svc_ips[*]-}" ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Feb 25 10:54:12 crc kubenswrapper[4725]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Feb 25 10:54:12 crc kubenswrapper[4725]: # Only continue rebuilding the hosts entries if its original content is preserved Feb 25 10:54:12 crc kubenswrapper[4725]: sleep 60 & wait Feb 25 10:54:12 crc kubenswrapper[4725]: continue Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: # Append resolver entries for services Feb 25 10:54:12 crc kubenswrapper[4725]: rc=0 Feb 25 10:54:12 crc kubenswrapper[4725]: for svc in "${!svc_ips[@]}"; do Feb 25 10:54:12 crc kubenswrapper[4725]: for ip in ${svc_ips[${svc}]}; do Feb 25 10:54:12 crc kubenswrapper[4725]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: if [[ $rc -ne 0 ]]; then Feb 25 10:54:12 crc kubenswrapper[4725]: sleep 60 & wait Feb 25 10:54:12 crc kubenswrapper[4725]: continue Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: Feb 25 10:54:12 crc kubenswrapper[4725]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Feb 25 10:54:12 crc kubenswrapper[4725]: # Replace /etc/hosts with our modified version if needed Feb 25 10:54:12 crc kubenswrapper[4725]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Feb 25 10:54:12 crc kubenswrapper[4725]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Feb 25 10:54:12 crc kubenswrapper[4725]: fi Feb 25 10:54:12 crc kubenswrapper[4725]: sleep 60 & wait Feb 25 10:54:12 crc kubenswrapper[4725]: unset svc_ips Feb 25 10:54:12 crc kubenswrapper[4725]: done Feb 25 10:54:12 crc kubenswrapper[4725]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rnp2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9989l_openshift-dns(9de69f49-3e33-4721-9fee-ad2fc45b16bf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 25 10:54:12 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.581987 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerStarted","Data":"53de2b2c6f1ce1983ddf9d22ccfdefdf9ab8e7a733c5f7f547fc89168a541589"} Feb 25 10:54:12 crc 
kubenswrapper[4725]: E0225 10:54:12.581995 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9989l" podUID="9de69f49-3e33-4721-9fee-ad2fc45b16bf" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.582452 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mbpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.583753 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.584254 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r45dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-9mhzp_openshift-multus(0c8d8877-1961-407f-b4a7-66e55321a6eb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.586091 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" podUID="0c8d8877-1961-407f-b4a7-66e55321a6eb" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.589300 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.604257 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.615035 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.626530 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.632212 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.632262 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc 
kubenswrapper[4725]: I0225 10:54:12.632280 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.632303 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.632345 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.637505 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.648620 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.662481 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.675280 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.694148 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.701733 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.714736 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.726719 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.735222 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.736262 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.736307 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.736316 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.736339 4725 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.736349 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.758609 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.769451 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.779888 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.789337 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.797109 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.805669 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.812710 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.812901 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:54:14.812873329 +0000 UTC m=+80.311455374 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.812952 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.813019 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.813081 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.813130 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:14.813120886 +0000 UTC m=+80.311702911 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.813132 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.813163 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:14.813157587 +0000 UTC m=+80.311739602 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.813791 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.823258 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.834091 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.839554 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.839602 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.839615 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.839634 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.839649 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.846174 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.875075 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.914194 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.914244 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.914279 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.914395 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.914410 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.914420 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.914463 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:14.914451164 +0000 UTC m=+80.413033189 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.914773 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.914818 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:14.914808823 +0000 UTC m=+80.413390848 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.915466 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.915533 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.915585 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:12 crc kubenswrapper[4725]: E0225 10:54:12.915669 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:14.915660106 +0000 UTC m=+80.414242131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.916577 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.942803 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.942888 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.942904 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.942926 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.942944 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:12Z","lastTransitionTime":"2026-02-25T10:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.960649 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read 
at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:12 crc kubenswrapper[4725]: I0225 10:54:12.998032 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.046952 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.047392 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.047499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.047610 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.047703 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.053061 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.151167 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.151244 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.151270 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.151295 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.151313 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.223434 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.223699 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.223774 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.223938 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:13 crc kubenswrapper[4725]: E0225 10:54:13.223812 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:13 crc kubenswrapper[4725]: E0225 10:54:13.223783 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:13 crc kubenswrapper[4725]: E0225 10:54:13.224015 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:13 crc kubenswrapper[4725]: E0225 10:54:13.224541 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.232711 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.235213 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.239090 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.240080 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.242069 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.242891 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.244451 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.245082 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.246231 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.246810 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.247401 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.248806 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.249355 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.250313 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.250926 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.251899 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.252474 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.253065 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.254467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.254520 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.254545 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.254568 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.254584 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.255427 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.257378 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.258956 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.260447 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.261002 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.262307 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.262808 4725 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.263993 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.264744 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.265778 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.266493 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.267442 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.268066 4725 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.268198 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.269950 
4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.271072 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.271532 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.273270 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.274393 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.275089 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.276160 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.276825 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.277658 
4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.278305 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.279317 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.280033 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.280882 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.281421 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.282453 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.283513 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.284773 
4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.285477 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.286704 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.287438 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.288270 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.289509 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.361604 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.361682 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.361708 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: 
I0225 10:54:13.361758 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.361786 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.465396 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.465480 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.465504 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.465536 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.465559 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.568902 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.568962 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.568983 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.569012 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.569034 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.672403 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.672466 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.672488 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.672510 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.672527 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.775299 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.775359 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.775380 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.775407 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.775427 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.878915 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.878999 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.879023 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.879053 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.879076 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.981169 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.981202 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.981230 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.981247 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:13 crc kubenswrapper[4725]: I0225 10:54:13.981257 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:13Z","lastTransitionTime":"2026-02-25T10:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.083073 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.083280 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.083312 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.083341 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.083363 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.186551 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.186590 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.186613 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.186637 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.186652 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.289454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.289496 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.289507 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.289522 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.289531 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.392070 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.392127 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.392205 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.392227 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.392247 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.495666 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.495754 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.495773 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.495800 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.495818 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.598705 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.598760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.598772 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.598797 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.598812 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.702740 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.702795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.702815 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.702878 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.702897 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.713078 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.713139 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.713162 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.713187 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.713204 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.728915 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.735017 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.735098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.735118 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.735142 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.735161 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.750772 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.757606 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.757716 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.757739 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.757824 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.757906 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.781992 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.782097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.782117 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.782147 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.782176 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.804539 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.804772 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.804969 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.805265 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.805435 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.822317 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.822876 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.825069 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.825144 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.825164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.825193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.825212 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.836588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.836757 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.836811 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.836989 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.837052 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:18.837037633 +0000 UTC m=+84.335619648 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.837093 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:54:18.837060214 +0000 UTC m=+84.335642269 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.837117 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.837225 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:18.837190607 +0000 UTC m=+84.335772662 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.928334 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.928439 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.928467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.928508 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.928624 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:14Z","lastTransitionTime":"2026-02-25T10:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.938005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.938121 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:14 crc kubenswrapper[4725]: I0225 10:54:14.938163 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.938347 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.938384 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.938437 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.938524 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:18.938502195 +0000 UTC m=+84.437084260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.938608 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.938746 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:18.938721521 +0000 UTC m=+84.437303596 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.938937 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.939015 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.939090 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:14 crc kubenswrapper[4725]: E0225 10:54:14.939204 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:18.939191063 +0000 UTC m=+84.437773168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.031820 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.031888 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.031901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.031920 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.031933 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.134752 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.134798 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.134809 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.134886 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.134901 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.224173 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.224279 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.224503 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.224562 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:15 crc kubenswrapper[4725]: E0225 10:54:15.224569 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:15 crc kubenswrapper[4725]: E0225 10:54:15.224667 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:15 crc kubenswrapper[4725]: E0225 10:54:15.225738 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:15 crc kubenswrapper[4725]: E0225 10:54:15.226052 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.234576 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.237152 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.237179 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.237189 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.237201 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.237214 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.248285 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.262949 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.273608 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.284172 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.298169 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.311209 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.326400 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.340639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.340681 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.340691 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.340707 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.340717 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.343750 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.353586 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.362339 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.371851 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.379613 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.395307 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.443634 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.443679 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.443689 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.443713 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.443727 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.546800 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.546879 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.546890 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.546912 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.546938 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.649425 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.649470 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.649534 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.649563 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.649574 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.753420 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.753519 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.753622 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.753666 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.753692 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.857286 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.857910 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.857933 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.857961 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.857983 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.961533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.961577 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.961592 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.961611 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:15 crc kubenswrapper[4725]: I0225 10:54:15.961626 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:15Z","lastTransitionTime":"2026-02-25T10:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.065785 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.065933 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.065967 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.065994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.066014 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.170770 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.170882 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.170907 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.170941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.170965 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.274983 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.275225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.275247 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.275319 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.275342 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.379328 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.379403 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.379421 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.379450 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.379471 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.482763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.482812 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.482821 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.482862 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.482873 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.585972 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.586415 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.586501 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.586537 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.586555 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.690188 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.690255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.690267 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.690287 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.690301 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.793964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.794087 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.794112 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.794145 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.794168 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.897402 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.897489 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.897505 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.897528 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:16 crc kubenswrapper[4725]: I0225 10:54:16.897541 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:16Z","lastTransitionTime":"2026-02-25T10:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.001680 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.001756 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.001780 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.001809 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.001862 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.105561 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.105620 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.105631 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.105654 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.105669 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.208777 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.208868 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.208887 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.208910 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.208926 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.223625 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.223720 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.223755 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.223746 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:17 crc kubenswrapper[4725]: E0225 10:54:17.223940 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:17 crc kubenswrapper[4725]: E0225 10:54:17.224122 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:17 crc kubenswrapper[4725]: E0225 10:54:17.224290 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:17 crc kubenswrapper[4725]: E0225 10:54:17.224424 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.312521 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.312574 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.312593 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.312618 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.312635 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.416877 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.416936 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.416948 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.416970 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.416984 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.520265 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.521016 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.521107 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.521208 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.521310 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.624773 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.624857 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.624873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.624899 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.624919 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.728411 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.728466 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.728480 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.728504 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.728520 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.831111 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.831603 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.831679 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.831774 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.831881 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.935025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.935081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.935095 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.935115 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:17 crc kubenswrapper[4725]: I0225 10:54:17.935131 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:17Z","lastTransitionTime":"2026-02-25T10:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.038456 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.038934 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.039093 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.039248 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.039576 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.142583 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.142650 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.142662 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.142680 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.142691 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.241226 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.241267 4725 scope.go:117] "RemoveContainer" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.241524 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.245808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.245883 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.245897 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.245935 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.245951 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.350022 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.350095 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.350108 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.350150 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.350168 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.453782 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.453846 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.453861 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.453885 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.453898 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.557483 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.557591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.557617 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.557692 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.557718 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.599157 4725 scope.go:117] "RemoveContainer" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.599592 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.660298 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.660345 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.660357 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.660374 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.660388 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.764781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.764933 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.764964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.764998 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.765025 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.869082 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.869152 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.869196 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.869225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.869242 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.891371 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.891591 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 10:54:26.891545281 +0000 UTC m=+92.390127346 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.891698 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.891780 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.892049 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.892155 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-25 10:54:26.892132596 +0000 UTC m=+92.390714651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.892157 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.892334 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:26.89229774 +0000 UTC m=+92.390879805 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.974399 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.974467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.974485 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.974523 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.974543 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:18Z","lastTransitionTime":"2026-02-25T10:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.992658 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.992754 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:18 crc kubenswrapper[4725]: I0225 10:54:18.992785 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993046 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993039 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993064 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: 
E0225 10:54:18.993193 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:26.993157496 +0000 UTC m=+92.491739561 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993208 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993249 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993330 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:26.99329992 +0000 UTC m=+92.491882175 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993079 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993385 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:18 crc kubenswrapper[4725]: E0225 10:54:18.993472 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:26.993451244 +0000 UTC m=+92.492033519 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.077912 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.077966 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.077975 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.077994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.078009 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.181366 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.181605 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.181630 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.181697 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.181732 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.223373 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.223372 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.223362 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.223560 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:19 crc kubenswrapper[4725]: E0225 10:54:19.223725 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:19 crc kubenswrapper[4725]: E0225 10:54:19.223987 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:19 crc kubenswrapper[4725]: E0225 10:54:19.224185 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:19 crc kubenswrapper[4725]: E0225 10:54:19.224385 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.285737 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.285802 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.285858 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.285896 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.285920 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.390503 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.390572 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.390590 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.390618 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.390635 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.495041 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.495107 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.495120 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.495172 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.495185 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.599470 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.599533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.599552 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.599578 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.599595 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.702935 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.702991 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.703002 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.703025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.703038 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.806582 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.806671 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.806691 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.806725 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.806747 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.910065 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.910118 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.910135 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.910161 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:19 crc kubenswrapper[4725]: I0225 10:54:19.910181 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:19Z","lastTransitionTime":"2026-02-25T10:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.013891 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.013927 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.013935 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.013949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.013957 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.118325 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.118408 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.118428 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.118459 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.118481 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.222275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.222362 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.222381 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.222411 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.222434 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.325812 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.325948 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.325971 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.326006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.326027 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.428976 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.429021 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.429034 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.429069 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.429081 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.532386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.532449 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.532465 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.532488 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.532504 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.635949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.636008 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.636026 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.636049 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.636071 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.748603 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.748678 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.748703 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.748733 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.748755 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.851712 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.851750 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.851761 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.851775 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.851786 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.955335 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.955410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.955437 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.955465 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:20 crc kubenswrapper[4725]: I0225 10:54:20.955489 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:20Z","lastTransitionTime":"2026-02-25T10:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.058243 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.058280 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.058291 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.058315 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.058326 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.160512 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.160569 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.160581 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.160599 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.160612 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.223936 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.223994 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.224013 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:21 crc kubenswrapper[4725]: E0225 10:54:21.224107 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.224159 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:21 crc kubenswrapper[4725]: E0225 10:54:21.224318 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:21 crc kubenswrapper[4725]: E0225 10:54:21.224465 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:21 crc kubenswrapper[4725]: E0225 10:54:21.224652 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.264292 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.264365 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.264389 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.264420 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.264445 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.368123 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.368199 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.368218 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.368248 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.368269 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.472077 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.472188 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.472204 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.472229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.472245 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.524899 4725 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.575718 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.575766 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.575777 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.575801 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.575813 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.677770 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.677809 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.677817 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.677845 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.677857 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.781036 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.781090 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.781108 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.781130 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.781147 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.884879 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.884952 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.884971 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.884994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.885012 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.988565 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.988639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.988657 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.988680 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:21 crc kubenswrapper[4725]: I0225 10:54:21.988696 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:21Z","lastTransitionTime":"2026-02-25T10:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.090961 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.090997 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.091009 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.091024 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.091036 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.193949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.194027 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.194051 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.194081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.194101 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.296938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.297009 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.297023 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.297038 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.297048 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.400209 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.400263 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.400276 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.400295 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.400308 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.503781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.503923 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.503938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.503963 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.503977 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.607196 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.607271 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.607285 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.607303 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.607320 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.710614 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.710715 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.710733 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.710764 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.710782 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.814410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.814467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.814477 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.814497 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.814507 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.918255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.918319 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.918338 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.918363 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:22 crc kubenswrapper[4725]: I0225 10:54:22.918380 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:22Z","lastTransitionTime":"2026-02-25T10:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.021693 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.021750 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.021788 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.021817 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.021890 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.124596 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.124721 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.124745 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.124780 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.124801 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.224100 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.224541 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.224810 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.224907 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:23 crc kubenswrapper[4725]: E0225 10:54:23.224934 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:23 crc kubenswrapper[4725]: E0225 10:54:23.225161 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:23 crc kubenswrapper[4725]: E0225 10:54:23.225329 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:23 crc kubenswrapper[4725]: E0225 10:54:23.225483 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.228335 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.228371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.228386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.228406 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.228420 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.331529 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.331577 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.331610 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.331630 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.331643 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.434757 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.435328 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.435342 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.435365 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.435379 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.538569 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.538621 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.538631 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.538653 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.538667 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.616717 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8zw9d" event={"ID":"b4a262bc-bc77-471f-91d7-58fb221fa404","Type":"ContainerStarted","Data":"90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.618512 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9989l" event={"ID":"9de69f49-3e33-4721-9fee-ad2fc45b16bf","Type":"ContainerStarted","Data":"289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.620581 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.620694 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.635785 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.643164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.643218 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.643229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.643257 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.643270 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.650326 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.664512 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.676993 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.687197 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.703660 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.719666 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.732330 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.746953 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.747015 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.747035 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.747061 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.747081 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.751224 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.797980 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.818318 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.842425 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.849307 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.849335 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.849344 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.849358 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.849368 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.853779 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read 
at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.864486 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.886428 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.900602 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.914525 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.923713 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.935081 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.946388 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.951844 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.951896 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.951961 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.951984 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.952000 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:23Z","lastTransitionTime":"2026-02-25T10:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.961874 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.972513 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.980623 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.989431 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:23 crc kubenswrapper[4725]: I0225 10:54:23.996099 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.017430 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.027888 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.039049 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.051205 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.054097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc 
kubenswrapper[4725]: I0225 10:54:24.054137 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.054146 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.054161 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.054173 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.061913 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.156941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.156989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.157002 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.157020 4725 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.157034 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.259322 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.259360 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.259371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.259388 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.259398 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.361982 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.362033 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.362045 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.362060 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.362072 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.464916 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.464957 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.464968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.464983 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.464992 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.567273 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.567336 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.567353 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.567379 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.567414 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.670013 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.670404 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.670413 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.670427 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.670436 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.772115 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.772193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.772208 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.772239 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.772255 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.875747 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.875794 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.875806 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.875850 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.875866 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.978765 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.978804 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.978820 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.978857 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:24 crc kubenswrapper[4725]: I0225 10:54:24.978867 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:24Z","lastTransitionTime":"2026-02-25T10:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.082656 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.082726 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.082740 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.082764 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.082780 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.187429 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.187481 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.187490 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.187507 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.187520 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.207433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.207482 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.207493 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.207512 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.207527 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.219930 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.223466 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.223604 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.224087 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.224146 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.224306 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.224330 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.224539 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.224694 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.226397 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.226428 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.226439 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.226458 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.226469 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.241853 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.247996 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.252557 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.253994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.253646 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.254034 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.254219 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.254248 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.265720 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.270770 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.275941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.276013 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.276032 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.276057 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.276076 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.283557 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.290610 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.294225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.294268 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.294283 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.294303 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.294314 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.298643 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.307069 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.307192 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.310565 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.311675 4725 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.311715 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.311725 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.311752 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.311764 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.328537 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.340874 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.364241 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.373553 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.385641 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: E0225 10:54:25.394476 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720.scope\": RecentStats: unable to find data in memory cache]" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.411803 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.415534 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.415595 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.415609 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.415629 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.415643 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.424426 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.435012 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.442286 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.519567 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.519605 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.519615 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.519632 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.519645 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.623008 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.623044 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.623060 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.623077 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.623089 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.627630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerStarted","Data":"585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.629731 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.629765 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.639160 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.651092 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.668710 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.677472 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.686511 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.694446 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.708357 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.718661 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.725395 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc 
kubenswrapper[4725]: I0225 10:54:25.725449 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.725463 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.725487 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.725501 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.730165 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.742227 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.754399 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.763254 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.774199 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.786084 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.796629 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.807533 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.817713 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.828055 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.830007 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.830079 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.830098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.830124 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.830142 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.840556 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.850188 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.861489 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.873705 4725 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.885394 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.894792 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.905011 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.916583 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.932604 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.932566 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.932663 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.932871 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.932915 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.932939 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:25Z","lastTransitionTime":"2026-02-25T10:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.945636 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.953948 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:25 crc kubenswrapper[4725]: I0225 10:54:25.961368 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.036162 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.036202 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.036210 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.036225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.036233 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.139049 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.139099 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.139116 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.139138 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.139168 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.242733 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.243944 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.243958 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.243978 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.243991 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.347015 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.347075 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.347098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.347126 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.347146 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.454214 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.454257 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.454268 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.454286 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.454301 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.556994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.557035 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.557043 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.557056 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.557067 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.634622 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c8d8877-1961-407f-b4a7-66e55321a6eb" containerID="585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc" exitCode=0 Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.634720 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerDied","Data":"585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.637385 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.653567 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.659371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.659401 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.659411 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.659441 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.659451 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.678467 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.699011 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.722464 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.740024 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.757139 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.764217 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.764245 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc 
kubenswrapper[4725]: I0225 10:54:26.764255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.764270 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.764280 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.777448 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.796445 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.810214 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc 
kubenswrapper[4725]: I0225 10:54:26.828526 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.845008 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.864168 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.867639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.867685 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.867701 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.867723 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.867737 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.877981 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.892301 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.909202 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.909298 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.909357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:26 crc kubenswrapper[4725]: E0225 10:54:26.909406 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:54:42.909388519 +0000 UTC m=+108.407970544 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:54:26 crc kubenswrapper[4725]: E0225 10:54:26.909453 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:26 crc kubenswrapper[4725]: E0225 10:54:26.909510 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:26 crc kubenswrapper[4725]: E0225 10:54:26.909518 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:42.909506592 +0000 UTC m=+108.408088617 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:26 crc kubenswrapper[4725]: E0225 10:54:26.909538 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:42.909531953 +0000 UTC m=+108.408113968 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.914264 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.926006 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.935998 4725 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.945416 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.963407 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.969662 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.969706 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.969718 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.969733 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.969762 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:26Z","lastTransitionTime":"2026-02-25T10:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.975895 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:26 crc kubenswrapper[4725]: I0225 10:54:26.986573 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.001842 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.010197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.010237 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.010269 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010368 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010378 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010393 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010427 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:43.010411379 +0000 UTC m=+108.508993404 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010430 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010448 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010472 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:43.010466281 +0000 UTC m=+108.509048296 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010398 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010496 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.010542 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:54:43.010526942 +0000 UTC m=+108.509108967 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.016019 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.025856 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc 
kubenswrapper[4725]: I0225 10:54:27.042146 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.058122 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.072227 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.072282 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.072297 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.072319 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.072332 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.075548 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.091564 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.108753 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.122714 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.174850 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.174898 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.174909 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.174928 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.174940 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.224418 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.224455 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.224545 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.224527 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.224667 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.224931 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.225198 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:27 crc kubenswrapper[4725]: E0225 10:54:27.225330 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.290873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.290933 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.290948 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.290973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.290992 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.394231 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.394616 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.394628 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.394645 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.394657 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.497521 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.497582 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.497608 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.497637 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.497658 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.600882 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.600915 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.600925 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.600942 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.600968 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.643115 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c8d8877-1961-407f-b4a7-66e55321a6eb" containerID="629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a" exitCode=0 Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.643213 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerDied","Data":"629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.645250 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.648438 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" event={"ID":"8f769618-965f-430a-8f67-e1ef4d94a063","Type":"ContainerStarted","Data":"1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.648514 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" event={"ID":"8f769618-965f-430a-8f67-e1ef4d94a063","Type":"ContainerStarted","Data":"36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.650351 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerStarted","Data":"5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.673303 4725 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.685598 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.701562 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.703903 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.703941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.703955 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.703973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.703985 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.713491 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.723585 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.741603 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.760086 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.780964 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.800512 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.806931 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.806968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.806977 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.806997 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.807006 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.810710 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.823787 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.836346 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.848643 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.861485 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.874699 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc 
kubenswrapper[4725]: I0225 10:54:27.890205 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.904058 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.909426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.909461 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.909470 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.909484 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.909492 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:27Z","lastTransitionTime":"2026-02-25T10:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.917391 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.930808 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.947298 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.961572 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc 
kubenswrapper[4725]: I0225 10:54:27.973683 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.986622 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:27 crc kubenswrapper[4725]: I0225 10:54:27.999150 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.008641 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.011553 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.011641 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.011705 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.011792 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.011874 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.023742 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.042575 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.055171 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.067879 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\
\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.081038 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.114361 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.114416 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.114437 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.114462 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.114482 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.216644 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.216684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.216694 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.216709 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.216721 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.319708 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.319779 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.319788 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.319819 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.319851 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.422321 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.422361 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.422374 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.422393 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.422403 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.519791 4725 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.524702 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.524763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.524782 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.524806 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.524823 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.627116 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.627647 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.627877 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.628045 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.628178 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.657089 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c8d8877-1961-407f-b4a7-66e55321a6eb" containerID="e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a" exitCode=0 Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.657426 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerDied","Data":"e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.659521 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b" exitCode=0 Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.659627 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.671923 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.693295 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.706081 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.717147 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.731625 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.731672 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.731689 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc 
kubenswrapper[4725]: I0225 10:54:28.731713 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.731735 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.732878 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 
10:54:28.743882 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.757479 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.772051 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.787455 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.803709 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.818811 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc 
kubenswrapper[4725]: I0225 10:54:28.834878 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.834928 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.834942 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.834961 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.834973 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.839445 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.856889 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.879880 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.891816 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.906107 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.923562 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 
10:54:28.937713 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.937763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.937780 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.937805 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.937817 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:28Z","lastTransitionTime":"2026-02-25T10:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.942478 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.959616 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.973697 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d79
3426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:28 crc kubenswrapper[4725]: I0225 10:54:28.986048 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:28Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.002848 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.015869 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.035486 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.040165 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.040218 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.040229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.040246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.040257 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.054916 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.074534 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.087007 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc 
kubenswrapper[4725]: I0225 10:54:29.100275 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.112447 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.127732 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.142453 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.142494 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.142503 4725 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.142521 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.142530 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.223181 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.223219 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:29 crc kubenswrapper[4725]: E0225 10:54:29.223733 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.223405 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.223244 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:29 crc kubenswrapper[4725]: E0225 10:54:29.223884 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:29 crc kubenswrapper[4725]: E0225 10:54:29.223990 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:29 crc kubenswrapper[4725]: E0225 10:54:29.224167 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.246205 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.246267 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.246279 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.246298 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.246312 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.349328 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.349374 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.349385 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.349404 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.349417 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.451366 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.451407 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.451421 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.451435 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.451448 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.554323 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.554401 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.554413 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.554434 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.554447 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.657455 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.657503 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.657514 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.657531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.657541 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.669008 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.669068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.669082 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.669095 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.669107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.669116 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} Feb 25 10:54:29 crc kubenswrapper[4725]: 
I0225 10:54:29.672703 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c8d8877-1961-407f-b4a7-66e55321a6eb" containerID="23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2" exitCode=0 Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.672738 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerDied","Data":"23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.699085 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.725787 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.742293 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.754347 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.759646 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc 
kubenswrapper[4725]: I0225 10:54:29.759719 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.759734 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.759758 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.759776 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.766085 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.785202 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.795468 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.805632 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.816846 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.826940 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.846576 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.859972 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.862394 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.862417 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.862424 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.862438 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.862448 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.872648 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.884018 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.899812 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:29Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.965108 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.965160 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.965173 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.965191 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:29 crc kubenswrapper[4725]: I0225 10:54:29.965207 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:29Z","lastTransitionTime":"2026-02-25T10:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.068313 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.068377 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.068395 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.068426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.068446 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.171042 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.171081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.171089 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.171102 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.171111 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.275808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.275941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.275966 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.276406 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.276750 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.379519 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.379573 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.379587 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.379603 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.379616 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.482255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.482299 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.482309 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.482322 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.482335 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.585322 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.585378 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.585392 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.585420 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.585436 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.681629 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c8d8877-1961-407f-b4a7-66e55321a6eb" containerID="7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609" exitCode=0 Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.681697 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerDied","Data":"7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.687854 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.687913 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.687938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.687968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.687986 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.703992 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.723521 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.739341 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.760694 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.776791 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc 
kubenswrapper[4725]: I0225 10:54:30.791014 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.791054 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.791065 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.791083 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.791100 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.794299 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.808135 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28
9109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.821910 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.834458 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.847615 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.871700 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.892728 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.895026 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.895105 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.896014 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.896064 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.896081 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.914871 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.930285 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.945661 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:30Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.999299 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.999386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.999406 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.999438 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:30 crc kubenswrapper[4725]: I0225 10:54:30.999468 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:30Z","lastTransitionTime":"2026-02-25T10:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.102319 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.102385 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.102403 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.102427 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.102447 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.205180 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.205229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.205246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.205269 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.205286 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.223965 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.224136 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.224257 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:31 crc kubenswrapper[4725]: E0225 10:54:31.224250 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:31 crc kubenswrapper[4725]: E0225 10:54:31.224652 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:31 crc kubenswrapper[4725]: E0225 10:54:31.224732 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.225872 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:31 crc kubenswrapper[4725]: E0225 10:54:31.226102 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.309178 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.309280 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.309298 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.309371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.309389 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.411959 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.412030 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.412055 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.412088 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.412114 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.516205 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.516289 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.516314 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.516346 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.516381 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.618913 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.618952 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.618960 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.618975 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.618984 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.709314 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.713701 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c8d8877-1961-407f-b4a7-66e55321a6eb" containerID="7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e" exitCode=0 Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.713777 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerDied","Data":"7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.721335 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.721387 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.721400 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.721421 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.721436 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.732632 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.747510 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.760919 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.776644 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.789074 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.798877 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc 
kubenswrapper[4725]: I0225 10:54:31.811106 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.824379 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.824427 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.824441 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.824462 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.824480 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.826437 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.839238 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.850401 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.867440 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.886620 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.904298 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.921165 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d79
3426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.929136 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.929179 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.929191 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.929211 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.929224 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:31Z","lastTransitionTime":"2026-02-25T10:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:31 crc kubenswrapper[4725]: I0225 10:54:31.932262 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:31Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.033118 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.033551 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.033561 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.033576 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.033585 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.136538 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.136579 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.136591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.136607 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.136620 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.224951 4725 scope.go:117] "RemoveContainer" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:54:32 crc kubenswrapper[4725]: E0225 10:54:32.225210 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.239179 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.239242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.239266 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.239294 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.239316 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.341599 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.341659 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.341679 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.341704 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.341723 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.444010 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.444051 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.444060 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.444074 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.444083 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.547936 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.547990 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.548006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.548032 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.548049 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.651652 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.651721 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.651745 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.651770 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.651788 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.727622 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" event={"ID":"0c8d8877-1961-407f-b4a7-66e55321a6eb","Type":"ContainerStarted","Data":"a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.741769 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc 
kubenswrapper[4725]: I0225 10:54:32.754290 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.755716 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.755764 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.755776 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.755797 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.755808 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.768735 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.779658 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.794621 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.811718 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8
bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:
54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.824909 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.840688 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.853636 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.858418 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.858453 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.858464 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc 
kubenswrapper[4725]: I0225 10:54:32.858482 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.858495 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.864469 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.884683 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.900435 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.916420 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.929764 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.951203 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:32Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.960675 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.960727 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.960739 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.960762 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:32 crc kubenswrapper[4725]: I0225 10:54:32.960776 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:32Z","lastTransitionTime":"2026-02-25T10:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.063531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.063568 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.063581 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.063598 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.063610 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.166187 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.166231 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.166246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.166264 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.166278 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.224380 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.224385 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:33 crc kubenswrapper[4725]: E0225 10:54:33.224528 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.224401 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:33 crc kubenswrapper[4725]: E0225 10:54:33.224608 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.224379 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:33 crc kubenswrapper[4725]: E0225 10:54:33.224684 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:33 crc kubenswrapper[4725]: E0225 10:54:33.224739 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.269150 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.269197 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.269214 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.269234 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.269247 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.371476 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.371520 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.371535 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.371556 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.371572 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.475123 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.475162 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.475172 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.475191 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.475202 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.578123 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.578187 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.578206 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.578232 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.578251 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.682361 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.683697 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.683730 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.683762 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.683785 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.738348 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.738798 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.738858 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.763727 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.775674 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.778420 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.786754 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.786795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.786808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.786842 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.786854 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.796978 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.812560 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.830281 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.844071 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.865371 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c
1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.887470 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8
f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.889355 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.889397 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.889409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.889430 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.889444 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.906428 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.923451 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a
11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.938795 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.957350 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25
T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.974700 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.992383 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.992460 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.992475 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.992492 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.992532 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:33Z","lastTransitionTime":"2026-02-25T10:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:33 crc kubenswrapper[4725]: I0225 10:54:33.992930 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:33Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.015579 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.036653 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.054518 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.079285 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.099552 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.099622 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.099633 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.099651 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.099685 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.101488 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.118257 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.133921 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc 
kubenswrapper[4725]: I0225 10:54:34.147991 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.164365 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.183996 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.194971 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.204208 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.204258 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.204276 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.204305 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.204322 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.218713 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.245480 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.308134 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.308198 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.308217 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.308245 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.308267 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.308335 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.323707 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a
11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.335377 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.411193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.411250 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.411267 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.411290 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.411317 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.514865 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.514916 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.514927 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.514947 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.514960 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.618078 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.618177 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.618206 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.618238 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.618264 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.721321 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.721364 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.721378 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.721394 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.721408 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.742908 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.768546 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.799160 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.823113 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.823898 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.824022 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.824111 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.824199 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.824271 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.838715 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.854073 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a
11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.865503 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.884545 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.900064 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.912142 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.922995 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.926332 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.926409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.926431 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.926461 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.926483 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:34Z","lastTransitionTime":"2026-02-25T10:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.934756 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.950306 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.966453 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.982399 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:34 crc kubenswrapper[4725]: I0225 10:54:34.998121 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:34Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.009485 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc 
kubenswrapper[4725]: I0225 10:54:35.028801 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.028880 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.028897 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.028920 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.028936 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.131949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.132018 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.132044 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.132076 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.132102 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.223590 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.223651 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.223678 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.223785 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.224056 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.224015 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.224185 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.224338 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.234441 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.234504 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.234521 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.234557 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.234574 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.249998 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.267700 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a
11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.284326 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.321675 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.336693 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.336752 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.336771 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.336796 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.336817 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.341818 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.362579 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.381606 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.397081 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.408539 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc 
kubenswrapper[4725]: I0225 10:54:35.422327 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.429542 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.429566 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.429573 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.429586 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.429594 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.436615 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.447139 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.451697 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.458763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 
10:54:35.458794 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.458805 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.458820 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.458845 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.471729 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.473644 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.475735 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.475802 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.475816 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.475848 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.475862 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.487587 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.490561 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742f
d0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mou
ntPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.491435 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.491467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.491478 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.491493 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.491504 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.509092 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.509686 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.517949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.517988 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.518017 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.518032 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.518044 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.538596 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:35 crc kubenswrapper[4725]: E0225 10:54:35.538761 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.540391 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.540419 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.540432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.540448 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.540458 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.643060 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.643124 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.643147 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.643177 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.643198 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.745278 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.745319 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.745334 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.745350 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.745362 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.854396 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.854504 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.854530 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.855044 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.855060 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.959030 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.959082 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.959097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.959121 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:35 crc kubenswrapper[4725]: I0225 10:54:35.959136 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:35Z","lastTransitionTime":"2026-02-25T10:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.063013 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.063081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.063098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.063122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.063138 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.166221 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.166266 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.166277 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.166292 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.166303 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.268916 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.268993 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.269019 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.269048 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.269073 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.372283 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.372353 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.372378 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.372410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.372429 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.475157 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.475227 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.475250 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.475277 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.475299 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.578132 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.578193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.578210 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.578232 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.578248 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.681274 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.681448 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.681467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.681490 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.681507 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.752779 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/0.log" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.757019 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b" exitCode=1 Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.757102 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.758486 4725 scope.go:117] "RemoveContainer" containerID="ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.775265 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.789890 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.789959 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.789974 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 
10:54:36.790004 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.790022 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.798872 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.824390 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.838672 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc 
kubenswrapper[4725]: I0225 10:54:36.852516 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.873707 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e
6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.888816 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28
9109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.894422 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.894488 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.894505 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc 
kubenswrapper[4725]: I0225 10:54:36.894531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.894548 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.904877 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.920663 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.954728 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:36Z\\\",\\\"message\\\":\\\"s/informers/externalversions/factory.go:140\\\\nI0225 10:54:36.221704 6640 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 10:54:36.221919 6640 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 10:54:36.222917 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 10:54:36.222974 6640 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 10:54:36.223016 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0225 10:54:36.223045 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0225 10:54:36.223085 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 10:54:36.223128 6640 factory.go:656] Stopping watch factory\\\\nI0225 10:54:36.223167 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0225 10:54:36.223235 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 10:54:36.223274 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 10:54:36.223311 6640 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 10:54:36.223365 6640 handler.go:208] Removed *v1.Namespace event 
ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.972050 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.992677 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:36Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.997558 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.997598 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.997610 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.997625 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:36 crc kubenswrapper[4725]: I0225 10:54:36.997639 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:36Z","lastTransitionTime":"2026-02-25T10:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.011335 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.022388 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.037722 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.100069 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.100133 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.100165 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.100197 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: 
I0225 10:54:37.100238 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.203216 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.203263 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.203277 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.203297 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.203311 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.223805 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.223968 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.223971 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.223820 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:37 crc kubenswrapper[4725]: E0225 10:54:37.224110 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:37 crc kubenswrapper[4725]: E0225 10:54:37.224263 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:37 crc kubenswrapper[4725]: E0225 10:54:37.224445 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:37 crc kubenswrapper[4725]: E0225 10:54:37.224541 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.306856 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.306905 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.306918 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.306959 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.306972 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.409855 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.409910 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.409922 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.409942 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.409956 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.512389 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.512446 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.512458 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.512483 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.512495 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.615384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.615430 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.615442 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.615460 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.615473 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.717994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.718044 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.718053 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.718067 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.718077 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.762080 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/0.log" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.767061 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.767428 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.781418 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefe
d9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.796659 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d
884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.814334 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:36Z\\\",\\\"message\\\":\\\"s/informers/externalversions/factory.go:140\\\\nI0225 10:54:36.221704 6640 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 10:54:36.221919 6640 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 10:54:36.222917 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 10:54:36.222974 6640 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 10:54:36.223016 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0225 10:54:36.223045 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0225 10:54:36.223085 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 10:54:36.223128 6640 factory.go:656] Stopping watch factory\\\\nI0225 10:54:36.223167 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0225 10:54:36.223235 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 10:54:36.223274 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 10:54:36.223311 6640 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 10:54:36.223365 6640 handler.go:208] Removed *v1.Namespace event 
ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.820756 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.820787 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.820796 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.820810 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.820819 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.826182 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.839422 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.852368 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.863918 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.873978 4725 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.886504 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.898936 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.910268 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.919157 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc 
kubenswrapper[4725]: I0225 10:54:37.922878 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.922908 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.922918 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.922934 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.922946 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:37Z","lastTransitionTime":"2026-02-25T10:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.931722 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.941387 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:37 crc kubenswrapper[4725]: I0225 10:54:37.951518 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.025884 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.025920 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.025928 4725 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.025941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.025949 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.128963 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.129018 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.129036 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.129058 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.129074 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.232086 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.232163 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.232193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.232224 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.232247 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.245274 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.334348 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.334397 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.334408 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.334426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.334437 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.437190 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.437235 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.437246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.437260 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.437271 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.540926 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.540994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.541019 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.541047 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.541069 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.645082 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.645181 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.645212 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.645242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.645264 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.749781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.750303 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.750327 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.750358 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.750380 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.774695 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/1.log" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.775700 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/0.log" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.779610 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c" exitCode=1 Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.779670 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.779915 4725 scope.go:117] "RemoveContainer" containerID="ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.781519 4725 scope.go:117] "RemoveContainer" containerID="5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c" Feb 25 10:54:38 crc kubenswrapper[4725]: E0225 10:54:38.781797 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.817205 4725 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02
-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c6
9dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.843249 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.854518 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.854612 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.854638 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.854673 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.854696 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.872406 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.889614 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.921080 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6fc43e8e3c0fb21e96be6463dd601b0a3bbf2101b5bc47af52899c6e052d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:36Z\\\",\\\"message\\\":\\\"s/informers/externalversions/factory.go:140\\\\nI0225 10:54:36.221704 6640 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 10:54:36.221919 6640 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 10:54:36.222917 6640 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 10:54:36.222974 6640 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 10:54:36.223016 6640 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0225 10:54:36.223045 6640 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0225 10:54:36.223085 6640 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 10:54:36.223128 6640 factory.go:656] Stopping watch factory\\\\nI0225 10:54:36.223167 6640 ovnkube.go:599] Stopped ovnkube\\\\nI0225 10:54:36.223235 6640 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 10:54:36.223274 6640 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 10:54:36.223311 6640 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 10:54:36.223365 6640 handler.go:208] Removed *v1.Namespace event ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:37Z\\\",\\\"message\\\":\\\"generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 10:54:37.868565 6756 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:37.868568 6756 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.937590 4725 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.950521 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.957435 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.957489 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.957504 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:38 crc 
kubenswrapper[4725]: I0225 10:54:38.957525 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.957542 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:38Z","lastTransitionTime":"2026-02-25T10:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 10:54:38.973676 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:38 crc kubenswrapper[4725]: I0225 
10:54:38.993494 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:38Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.017745 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.037953 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.058149 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.061315 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.061413 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.061440 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.061478 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.061509 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.079735 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:
54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.093787 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc 
kubenswrapper[4725]: I0225 10:54:39.114619 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.135566 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.164526 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.164591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.164608 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.164633 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.164652 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.223291 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.223370 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.223388 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.223422 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:39 crc kubenswrapper[4725]: E0225 10:54:39.223588 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:39 crc kubenswrapper[4725]: E0225 10:54:39.223674 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:39 crc kubenswrapper[4725]: E0225 10:54:39.223780 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:39 crc kubenswrapper[4725]: E0225 10:54:39.223863 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.267958 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.268053 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.268071 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.268097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.268113 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.370788 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.370896 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.370916 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.370938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.370954 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.474512 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.474591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.474615 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.474642 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.474663 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.578068 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.578131 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.578148 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.578177 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.578197 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.680702 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.680771 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.680785 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.680807 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.680822 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.783431 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.783492 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.783509 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.783532 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.783547 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.786412 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/1.log" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.791040 4725 scope.go:117] "RemoveContainer" containerID="5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c" Feb 25 10:54:39 crc kubenswrapper[4725]: E0225 10:54:39.791236 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.804657 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.817850 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.832655 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.846619 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.866155 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.884477 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.888340 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.888411 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.888430 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.888455 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.888475 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.897563 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.912934 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.932419 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc 
kubenswrapper[4725]: I0225 10:54:39.968082 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.993058 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.993126 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.993149 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.993180 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.993201 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:39Z","lastTransitionTime":"2026-02-25T10:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:39 crc kubenswrapper[4725]: I0225 10:54:39.993403 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:39Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.008029 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:40Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.022711 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:40Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.037469 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:40Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.050807 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:40Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.074175 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:37Z\\\",\\\"message\\\":\\\"generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 10:54:37.868565 6756 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:37.868568 6756 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:40Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.097379 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.097474 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.097500 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.097533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.097554 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.200738 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.200873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.200897 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.200939 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.200981 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.305366 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.306234 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.306499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.306687 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.306868 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.410496 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.411034 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.411187 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.411324 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.411457 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.514758 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.515265 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.515372 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.515465 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.515561 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.619147 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.619213 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.619233 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.619263 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.619285 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.722757 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.722858 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.722885 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.722914 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.722935 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.826023 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.826079 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.826091 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.826114 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.826128 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.929202 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.929257 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.929279 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.929309 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:40 crc kubenswrapper[4725]: I0225 10:54:40.929331 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:40Z","lastTransitionTime":"2026-02-25T10:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.033224 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.033274 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.033292 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.033316 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.033334 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.136873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.136943 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.137000 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.137023 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.137040 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.223288 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.223330 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.223320 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.223421 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:41 crc kubenswrapper[4725]: E0225 10:54:41.223639 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:41 crc kubenswrapper[4725]: E0225 10:54:41.224001 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:41 crc kubenswrapper[4725]: E0225 10:54:41.223783 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:41 crc kubenswrapper[4725]: E0225 10:54:41.224192 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.239571 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.239659 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.239709 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.239735 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.239752 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.343698 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.343906 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.344036 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.344068 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.344078 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.446648 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.446708 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.446726 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.446748 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.446767 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.550339 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.550400 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.550419 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.550443 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.550460 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.653329 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.653383 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.653403 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.653427 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.653444 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.756084 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.756140 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.756156 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.756201 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.756219 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.860255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.860342 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.860363 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.860388 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.860408 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.963745 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.963809 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.963856 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.963887 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:41 crc kubenswrapper[4725]: I0225 10:54:41.963910 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:41Z","lastTransitionTime":"2026-02-25T10:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.065768 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.065822 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.065867 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.065887 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.065902 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.169447 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.169549 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.169571 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.169596 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.169613 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.273658 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.273738 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.273762 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.273790 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.273810 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.376595 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.376642 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.376655 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.376673 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.376687 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.481042 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.481103 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.481120 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.481142 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.481159 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.584533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.584598 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.584612 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.584634 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.584649 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.687995 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.688043 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.688059 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.688081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.688097 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.791939 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.792025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.792045 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.792069 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.792091 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.895364 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.895423 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.895440 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.895463 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.895482 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.998814 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.998923 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.998936 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:42 crc kubenswrapper[4725]: I0225 10:54:42.998956 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:42.998968 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:42Z","lastTransitionTime":"2026-02-25T10:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:42.999601 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:42.999800 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 10:55:14.999736564 +0000 UTC m=+140.498318599 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:42.999894 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.000019 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.000123 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.000176 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-25 10:55:15.000165235 +0000 UTC m=+140.498747290 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.000187 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.000285 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:55:15.000254348 +0000 UTC m=+140.498836443 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.101479 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.101573 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.101662 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.101870 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.101899 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.101916 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.101950 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.101952 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.101965 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.101977 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.102018 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:55:15.101954516 +0000 UTC m=+140.600536581 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.102107 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:55:15.102077189 +0000 UTC m=+140.600659264 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.102155 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:55:15.1021285 +0000 UTC m=+140.600710565 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.102667 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.102735 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.102764 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.102817 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.102884 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.205710 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.205756 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.205773 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.205795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.205814 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.223803 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.223899 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.223954 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.223803 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.224089 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.224184 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.224363 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:43 crc kubenswrapper[4725]: E0225 10:54:43.226359 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.309428 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.309481 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.309494 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.309513 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.309527 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.412705 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.412781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.412801 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.412866 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.412893 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.516193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.516259 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.516276 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.516299 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.516318 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.619752 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.619823 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.619881 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.619916 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.619937 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.723907 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.723971 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.723993 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.724021 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.724042 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.826898 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.826973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.826999 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.827031 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.827061 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.930402 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.930510 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.930535 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.930563 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:43 crc kubenswrapper[4725]: I0225 10:54:43.930582 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:43Z","lastTransitionTime":"2026-02-25T10:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.034301 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.034392 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.034410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.034432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.034450 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.137563 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.137636 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.137660 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.137688 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.137708 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.241171 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.241249 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.241283 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.241309 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.241329 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.344083 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.344135 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.344149 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.344171 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.344185 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.447804 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.447943 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.447968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.448001 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.448029 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.551410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.551461 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.551472 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.551488 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.551499 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.654362 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.654432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.654454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.654486 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.654507 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.758152 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.758239 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.758260 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.758288 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.758309 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.867989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.868060 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.868079 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.868109 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.868129 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.970811 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.970926 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.970949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.971020 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:44 crc kubenswrapper[4725]: I0225 10:54:44.971047 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:44Z","lastTransitionTime":"2026-02-25T10:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.073985 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.074071 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.074090 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.074122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.074145 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.177527 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.177573 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.177581 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.177597 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.177609 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.223597 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.223687 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.223821 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.223925 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.224210 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.224289 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.224669 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.224884 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.225183 4725 scope.go:117] "RemoveContainer" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.248388 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entryp
oint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\
":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.268495 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f
3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.279499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.279567 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.279585 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 
10:54:45.279610 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.279627 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.288113 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.300817 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.316436 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.340331 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:37Z\\\",\\\"message\\\":\\\"generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 10:54:37.868565 6756 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:37.868568 6756 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.354960 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.369736 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.382316 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.382343 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.382350 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.382363 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.382397 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.383614 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.399612 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.414431 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.430864 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.444150 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.460751 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.471436 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc 
kubenswrapper[4725]: I0225 10:54:45.485242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.485361 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.485441 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.485531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.485627 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.490527 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.588359 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.588418 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.588437 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.588462 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.588480 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.691605 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.691804 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.691870 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.691901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.691925 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.693430 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.693488 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.693505 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.693526 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.693542 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.715611 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.725631 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.725681 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.725699 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.725722 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.725740 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.747467 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.752089 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.752143 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.752164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.752191 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.752213 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.774781 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.780670 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.780730 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.780750 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.780775 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.780794 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.802071 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.807970 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.808045 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.808065 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.808091 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.808110 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.816136 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.819347 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.819914 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.830309 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: E0225 10:54:45.830532 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.833036 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.833090 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.833128 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.833158 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.833178 4725 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.838318 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.852691 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.872774 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.888950 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.904280 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc 
kubenswrapper[4725]: I0225 10:54:45.936070 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.936161 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.936184 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.936219 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.936243 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:45Z","lastTransitionTime":"2026-02-25T10:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.937743 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.961975 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d
884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.975369 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:45 crc kubenswrapper[4725]: I0225 10:54:45.991149 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:45Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.003775 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:46Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.018206 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:46Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.039614 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.039706 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.039729 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.039760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.039783 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.082152 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:37Z\\\",\\\"message\\\":\\\"generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 10:54:37.868565 6756 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:37.868568 6756 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:46Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.098779 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:46Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.114419 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:46Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.126001 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:46Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.136240 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:46Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.142169 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.142207 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.142217 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.142234 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.142245 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.245022 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.245058 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.245069 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.245086 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.245097 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.347165 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.347229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.347246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.347272 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.347290 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.450432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.450495 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.450513 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.450547 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.450584 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.554122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.554208 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.554233 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.554266 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.554292 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.657247 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.657321 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.657335 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.657355 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.657369 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.760684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.760755 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.760775 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.760808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.760880 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.863789 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.863884 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.863900 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.863921 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.863935 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.969062 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.969129 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.969150 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.969175 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:46 crc kubenswrapper[4725]: I0225 10:54:46.969201 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:46Z","lastTransitionTime":"2026-02-25T10:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.072217 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.072255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.072269 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.072285 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.072295 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.180471 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.180542 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.180562 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.180585 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.180603 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.223791 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.223925 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:47 crc kubenswrapper[4725]: E0225 10:54:47.224015 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.224086 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.223805 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:47 crc kubenswrapper[4725]: E0225 10:54:47.224243 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:47 crc kubenswrapper[4725]: E0225 10:54:47.224417 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:47 crc kubenswrapper[4725]: E0225 10:54:47.224569 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.283902 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.283983 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.284008 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.284039 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.284061 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.387271 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.387406 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.387426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.387453 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.387472 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.491103 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.491152 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.491168 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.491190 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.491206 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.594516 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.594589 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.594622 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.594650 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.594672 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.697573 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.697631 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.697648 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.697671 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.697688 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.799592 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.799643 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.799658 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.799679 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.799691 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.905279 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.905321 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.905336 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.905353 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:47 crc kubenswrapper[4725]: I0225 10:54:47.905397 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:47Z","lastTransitionTime":"2026-02-25T10:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.009566 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.009698 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.009728 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.009761 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.009788 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.113114 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.113191 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.113220 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.113252 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.113276 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.216584 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.216643 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.216661 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.216717 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.216730 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.319152 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.319229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.319250 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.319281 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.319303 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.422606 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.422670 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.422689 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.422734 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.422780 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.526278 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.526354 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.526372 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.526398 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.526426 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.629138 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.629197 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.629221 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.629246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.629265 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.732200 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.732267 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.732281 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.732301 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.732317 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.834903 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.834969 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.834989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.835013 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.835039 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.937206 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.937263 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.937280 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.937302 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:48 crc kubenswrapper[4725]: I0225 10:54:48.937319 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:48Z","lastTransitionTime":"2026-02-25T10:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.040197 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.040274 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.040294 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.040319 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.040394 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.143964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.144026 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.144040 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.144059 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.144070 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.223986 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:49 crc kubenswrapper[4725]: E0225 10:54:49.224126 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.223994 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:49 crc kubenswrapper[4725]: E0225 10:54:49.224338 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.224012 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.224008 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:49 crc kubenswrapper[4725]: E0225 10:54:49.224467 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:49 crc kubenswrapper[4725]: E0225 10:54:49.224542 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.246337 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.246395 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.246413 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.246436 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.246521 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.348701 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.348773 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.348795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.348872 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.348897 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.452295 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.452359 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.452381 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.452410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.452436 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.560948 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.561006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.561020 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.561040 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.561053 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.663623 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.663681 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.663698 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.663721 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.663737 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.766489 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.766542 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.766559 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.766581 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.766594 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.870371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.870432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.870452 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.870475 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.870492 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.973241 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.973307 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.973326 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.973351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:49 crc kubenswrapper[4725]: I0225 10:54:49.973369 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:49Z","lastTransitionTime":"2026-02-25T10:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.076108 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.076192 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.076215 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.076244 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.076268 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.179674 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.179742 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.179760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.179783 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.179800 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.283040 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.283088 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.283103 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.283122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.283135 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.386699 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.386769 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.386791 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.386819 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.386873 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.489516 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.489584 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.489608 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.489639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.489661 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.592422 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.592480 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.592492 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.592508 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.592518 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.695794 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.695857 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.695866 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.695880 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.695889 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.797914 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.797996 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.798019 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.798077 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.798096 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.900205 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.900242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.900258 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.900273 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:50 crc kubenswrapper[4725]: I0225 10:54:50.900285 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:50Z","lastTransitionTime":"2026-02-25T10:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.002396 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.002433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.002443 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.002458 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.002469 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.107213 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.107276 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.107297 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.107326 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.107348 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.211469 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.211573 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.211591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.211618 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.211634 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.223910 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:51 crc kubenswrapper[4725]: E0225 10:54:51.224059 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.224148 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.224158 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.224187 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:51 crc kubenswrapper[4725]: E0225 10:54:51.224529 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:51 crc kubenswrapper[4725]: E0225 10:54:51.224597 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:51 crc kubenswrapper[4725]: E0225 10:54:51.224693 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.240202 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.314246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.314355 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.314375 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.314402 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.314420 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.418087 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.418141 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.418159 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.418180 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.418194 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.521912 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.521987 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.522005 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.522030 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.522056 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.624989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.625055 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.625073 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.625097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.625115 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.729272 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.729654 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.729667 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.729686 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.729697 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.833255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.833327 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.833352 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.833380 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.833401 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.935434 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.935485 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.935499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.935517 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:51 crc kubenswrapper[4725]: I0225 10:54:51.935532 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:51Z","lastTransitionTime":"2026-02-25T10:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.039119 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.039202 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.039223 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.039246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.039260 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.142558 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.142618 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.142639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.142666 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.142688 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.225163 4725 scope.go:117] "RemoveContainer" containerID="5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.244777 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.244911 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.244938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.244965 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.244983 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.348640 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.348782 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.348808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.348867 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.348888 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.451512 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.451544 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.451553 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.451568 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.451577 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.555098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.555146 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.555157 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.555175 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.555187 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.657531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.657583 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.657596 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.657615 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.657630 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.759942 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.759989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.760006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.760028 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.760045 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.845888 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/1.log" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.848582 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.848942 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.860311 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec
2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.861474 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.861495 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.861504 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.861516 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.861525 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.870406 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.880257 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.901519 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:37Z\\\",\\\"message\\\":\\\"generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 10:54:37.868565 6756 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:37.868568 6756 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/o
vnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.915438 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.927291 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 
10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.941457 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.960380 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.964249 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.964292 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.964311 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.964332 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.964346 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:52Z","lastTransitionTime":"2026-02-25T10:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.975574 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:52 crc kubenswrapper[4725]: I0225 10:54:52.989170 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.000776 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:52Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.013232 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.025669 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc 
kubenswrapper[4725]: I0225 10:54:53.047091 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.058376 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.066706 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.066742 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.066752 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.066768 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.066779 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.072046 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.087743 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.169269 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.169307 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.169315 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.169330 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.169342 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.224311 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.224353 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.224395 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.224314 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:53 crc kubenswrapper[4725]: E0225 10:54:53.224478 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:53 crc kubenswrapper[4725]: E0225 10:54:53.224541 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:53 crc kubenswrapper[4725]: E0225 10:54:53.224621 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:53 crc kubenswrapper[4725]: E0225 10:54:53.224750 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.271898 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.271948 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.271956 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.271973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.271983 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.374210 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.374250 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.374259 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.374273 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.374283 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.476001 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.476042 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.476052 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.476071 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.476082 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.579398 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.579449 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.579460 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.579476 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.579488 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.682392 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.682427 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.682435 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.682448 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.682457 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.785416 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.785487 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.785513 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.785542 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.785559 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.854295 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/2.log" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.855130 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/1.log" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.858285 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899" exitCode=1 Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.858366 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.858429 4725 scope.go:117] "RemoveContainer" containerID="5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.859526 4725 scope.go:117] "RemoveContainer" containerID="5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899" Feb 25 10:54:53 crc kubenswrapper[4725]: E0225 10:54:53.859962 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.885295 4725 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.888647 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.888710 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.888729 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.888759 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.888776 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.910770 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.925024 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.940789 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.958267 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.969075 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.983934 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.991311 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.991365 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.991375 4725 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.991390 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.991400 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:53Z","lastTransitionTime":"2026-02-25T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:53 crc kubenswrapper[4725]: I0225 10:54:53.997952 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693
efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.008875 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.026793 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d18
1874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.041906 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.057327 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.067846 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.078027 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.089103 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 
10:54:54.093552 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.093856 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.093866 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.093881 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.093891 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.098262 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.113798 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5771c5587452b85720d129057c992aad6c8492744289826a03443131cdafd53c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:37Z\\\",\\\"message\\\":\\\"generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 10:54:37.868565 6756 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:37Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:37.868568 6756 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", 
UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.195926 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.195962 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.195971 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.195986 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.195995 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.299174 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.299252 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.299275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.299308 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.299331 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.401901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.401942 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.401951 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.401965 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.401974 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.505039 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.505076 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.505085 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.505098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.505109 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.607187 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.607226 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.607235 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.607249 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.607258 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.710066 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.710105 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.710114 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.710128 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.710136 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.812863 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.812920 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.812933 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.812953 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.812966 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.864764 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/2.log" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.869439 4725 scope.go:117] "RemoveContainer" containerID="5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899" Feb 25 10:54:54 crc kubenswrapper[4725]: E0225 10:54:54.869651 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.883970 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.897565 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc 
kubenswrapper[4725]: I0225 10:54:54.915697 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.916727 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.916788 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.916799 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.916818 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.916865 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:54Z","lastTransitionTime":"2026-02-25T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.930949 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.949613 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.961879 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.980014 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:54 crc kubenswrapper[4725]: I0225 10:54:54.991714 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:54Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.007084 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.019650 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.019704 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.019713 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.019731 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.019744 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:55Z","lastTransitionTime":"2026-02-25T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.031083 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.043900 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.058621 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.069380 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.081306 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25
T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.091139 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.102654 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.121946 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.121973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.121984 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.121999 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.122008 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:55Z","lastTransitionTime":"2026-02-25T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.122005 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: E0225 10:54:55.222672 4725 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.224029 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.224072 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.224087 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:55 crc kubenswrapper[4725]: E0225 10:54:55.224489 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.224510 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:55 crc kubenswrapper[4725]: E0225 10:54:55.224724 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:55 crc kubenswrapper[4725]: E0225 10:54:55.224856 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:55 crc kubenswrapper[4725]: E0225 10:54:55.225018 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.241249 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.258637 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.276003 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.287151 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc 
kubenswrapper[4725]: I0225 10:54:55.300231 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.316486 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02
-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: E0225 10:54:55.329603 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.333283 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",
\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.349466 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.372520 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.385734 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.397576 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.414067 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f
36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was 
not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.428177 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.443683 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.453740 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.465330 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:55 crc kubenswrapper[4725]: I0225 10:54:55.477427 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:55Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.031322 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.031372 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.031384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.031400 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.031414 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:56Z","lastTransitionTime":"2026-02-25T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:56 crc kubenswrapper[4725]: E0225 10:54:56.050082 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:56Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.053855 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.053888 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.053901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.053916 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.053928 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:56Z","lastTransitionTime":"2026-02-25T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:56 crc kubenswrapper[4725]: E0225 10:54:56.066811 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:56Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.070168 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.070244 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.070256 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.070273 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.070283 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:56Z","lastTransitionTime":"2026-02-25T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:56 crc kubenswrapper[4725]: E0225 10:54:56.082893 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:56Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.086571 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.086608 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.086622 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.086639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.086653 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:56Z","lastTransitionTime":"2026-02-25T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:56 crc kubenswrapper[4725]: E0225 10:54:56.099604 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:56Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.104218 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.104257 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.104270 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.104290 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:54:56 crc kubenswrapper[4725]: I0225 10:54:56.104302 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:54:56Z","lastTransitionTime":"2026-02-25T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:54:56 crc kubenswrapper[4725]: E0225 10:54:56.129282 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:56Z is after 2025-08-24T17:21:41Z" Feb 25 10:54:56 crc kubenswrapper[4725]: E0225 10:54:56.129456 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:54:57 crc kubenswrapper[4725]: I0225 10:54:57.223712 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:57 crc kubenswrapper[4725]: I0225 10:54:57.223758 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:57 crc kubenswrapper[4725]: I0225 10:54:57.223778 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:57 crc kubenswrapper[4725]: I0225 10:54:57.223729 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:57 crc kubenswrapper[4725]: E0225 10:54:57.223913 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:57 crc kubenswrapper[4725]: E0225 10:54:57.224035 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:57 crc kubenswrapper[4725]: E0225 10:54:57.224102 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:57 crc kubenswrapper[4725]: E0225 10:54:57.224177 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:54:59 crc kubenswrapper[4725]: I0225 10:54:59.223364 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:54:59 crc kubenswrapper[4725]: I0225 10:54:59.223419 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:54:59 crc kubenswrapper[4725]: I0225 10:54:59.223535 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:54:59 crc kubenswrapper[4725]: E0225 10:54:59.223524 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:54:59 crc kubenswrapper[4725]: E0225 10:54:59.223746 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:54:59 crc kubenswrapper[4725]: I0225 10:54:59.223392 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:54:59 crc kubenswrapper[4725]: E0225 10:54:59.223967 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:54:59 crc kubenswrapper[4725]: E0225 10:54:59.224136 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:00 crc kubenswrapper[4725]: E0225 10:55:00.331216 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:01 crc kubenswrapper[4725]: I0225 10:55:01.223641 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:01 crc kubenswrapper[4725]: I0225 10:55:01.223634 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:01 crc kubenswrapper[4725]: I0225 10:55:01.223653 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:01 crc kubenswrapper[4725]: E0225 10:55:01.223961 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:01 crc kubenswrapper[4725]: I0225 10:55:01.224014 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:01 crc kubenswrapper[4725]: E0225 10:55:01.224107 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:01 crc kubenswrapper[4725]: E0225 10:55:01.224342 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:01 crc kubenswrapper[4725]: E0225 10:55:01.224511 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.850442 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.895460 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:02Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.913417 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:02Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.931088 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:02Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.944745 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:02Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.962291 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25
T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:02Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.979439 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:02Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:02 crc kubenswrapper[4725]: I0225 10:55:02.997983 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:02Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.023733 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.045062 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.065972 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc 
kubenswrapper[4725]: I0225 10:55:03.083667 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.103788 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.120018 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.134093 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.161518 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.176154 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.192099 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:03Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.223270 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.223346 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.223290 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:03 crc kubenswrapper[4725]: I0225 10:55:03.223273 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:03 crc kubenswrapper[4725]: E0225 10:55:03.223499 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:03 crc kubenswrapper[4725]: E0225 10:55:03.223585 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:03 crc kubenswrapper[4725]: E0225 10:55:03.223732 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:03 crc kubenswrapper[4725]: E0225 10:55:03.223900 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.223475 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:05 crc kubenswrapper[4725]: E0225 10:55:05.224051 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.223514 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:05 crc kubenswrapper[4725]: E0225 10:55:05.224159 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.223600 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:05 crc kubenswrapper[4725]: E0225 10:55:05.224231 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.223483 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:05 crc kubenswrapper[4725]: E0225 10:55:05.224287 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.240241 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.257157 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.273624 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.287317 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.296861 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc 
kubenswrapper[4725]: I0225 10:55:05.307280 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.316798 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.330592 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: E0225 10:55:05.331597 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.341625 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 
10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.358049 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.368088 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.382044 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.391569 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.402993 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.412191 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 
10:55:05.420749 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:05 crc kubenswrapper[4725]: I0225 10:55:05.435960 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:05Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.432923 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.433013 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.433039 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.433072 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.433093 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:06Z","lastTransitionTime":"2026-02-25T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:55:06 crc kubenswrapper[4725]: E0225 10:55:06.453768 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:06Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.458786 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.458963 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.459089 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.459220 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.459353 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:06Z","lastTransitionTime":"2026-02-25T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:06 crc kubenswrapper[4725]: E0225 10:55:06.477181 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:06Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.481667 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.481728 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.481746 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.481769 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.481787 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:06Z","lastTransitionTime":"2026-02-25T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:06 crc kubenswrapper[4725]: E0225 10:55:06.502628 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:06Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.508869 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.508966 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.508993 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.509028 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.509056 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:06Z","lastTransitionTime":"2026-02-25T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:06 crc kubenswrapper[4725]: E0225 10:55:06.529400 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:06Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.535329 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.535370 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.535385 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.535408 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:06 crc kubenswrapper[4725]: I0225 10:55:06.535422 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:06Z","lastTransitionTime":"2026-02-25T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:06 crc kubenswrapper[4725]: E0225 10:55:06.551674 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:06Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:06 crc kubenswrapper[4725]: E0225 10:55:06.552007 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:55:07 crc kubenswrapper[4725]: I0225 10:55:07.223304 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:07 crc kubenswrapper[4725]: I0225 10:55:07.223317 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:07 crc kubenswrapper[4725]: I0225 10:55:07.223391 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:07 crc kubenswrapper[4725]: I0225 10:55:07.223424 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:07 crc kubenswrapper[4725]: E0225 10:55:07.223517 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:07 crc kubenswrapper[4725]: E0225 10:55:07.223678 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:07 crc kubenswrapper[4725]: E0225 10:55:07.223995 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:07 crc kubenswrapper[4725]: E0225 10:55:07.224079 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:08 crc kubenswrapper[4725]: I0225 10:55:08.224368 4725 scope.go:117] "RemoveContainer" containerID="5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899" Feb 25 10:55:08 crc kubenswrapper[4725]: E0225 10:55:08.224541 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:55:08 crc kubenswrapper[4725]: I0225 10:55:08.237918 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 25 10:55:09 crc kubenswrapper[4725]: I0225 10:55:09.224116 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:09 crc kubenswrapper[4725]: I0225 10:55:09.224231 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:09 crc kubenswrapper[4725]: I0225 10:55:09.224117 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:09 crc kubenswrapper[4725]: E0225 10:55:09.224346 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:09 crc kubenswrapper[4725]: I0225 10:55:09.224387 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:09 crc kubenswrapper[4725]: E0225 10:55:09.224542 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:09 crc kubenswrapper[4725]: E0225 10:55:09.224640 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:09 crc kubenswrapper[4725]: E0225 10:55:09.224847 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:10 crc kubenswrapper[4725]: E0225 10:55:10.332484 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:11 crc kubenswrapper[4725]: I0225 10:55:11.223402 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:11 crc kubenswrapper[4725]: I0225 10:55:11.223462 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:11 crc kubenswrapper[4725]: I0225 10:55:11.223491 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:11 crc kubenswrapper[4725]: I0225 10:55:11.223469 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:11 crc kubenswrapper[4725]: E0225 10:55:11.223582 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:11 crc kubenswrapper[4725]: E0225 10:55:11.223680 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:11 crc kubenswrapper[4725]: E0225 10:55:11.223753 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:11 crc kubenswrapper[4725]: E0225 10:55:11.223812 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:13 crc kubenswrapper[4725]: I0225 10:55:13.223887 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:13 crc kubenswrapper[4725]: I0225 10:55:13.225232 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:13 crc kubenswrapper[4725]: I0225 10:55:13.225291 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:13 crc kubenswrapper[4725]: E0225 10:55:13.225439 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:13 crc kubenswrapper[4725]: I0225 10:55:13.227112 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:13 crc kubenswrapper[4725]: E0225 10:55:13.227334 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:13 crc kubenswrapper[4725]: E0225 10:55:13.227470 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:13 crc kubenswrapper[4725]: E0225 10:55:13.227554 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:13 crc kubenswrapper[4725]: I0225 10:55:13.243464 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 25 10:55:14 crc kubenswrapper[4725]: I0225 10:55:14.937637 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/0.log" Feb 25 10:55:14 crc kubenswrapper[4725]: I0225 10:55:14.937690 4725 generic.go:334] "Generic (PLEG): container finished" podID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" containerID="5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147" exitCode=1 Feb 25 10:55:14 crc kubenswrapper[4725]: I0225 10:55:14.937722 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerDied","Data":"5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147"} Feb 25 10:55:14 crc kubenswrapper[4725]: I0225 10:55:14.938056 4725 scope.go:117] "RemoveContainer" containerID="5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147" Feb 25 10:55:14 crc kubenswrapper[4725]: I0225 10:55:14.957793 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:14Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:14 crc kubenswrapper[4725]: I0225 10:55:14.973013 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:14Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:14 crc kubenswrapper[4725]: I0225 10:55:14.986470 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:14Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.003057 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:14Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.038232 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.047741 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.047888 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.047953 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.047952 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:19.047926206 +0000 UTC m=+204.546508261 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.048054 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.048143 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:56:19.048064479 +0000 UTC m=+204.546646514 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.048175 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.048308 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:56:19.048293195 +0000 UTC m=+204.546875450 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.063160 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.079421 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.094337 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.109687 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.130117 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.143573 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.149211 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.149247 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.149291 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149393 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149506 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. 
No retries permitted until 2026-02-25 10:56:19.149470535 +0000 UTC m=+204.648052620 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149421 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149550 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149563 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149580 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149643 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149663 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149616 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:56:19.149599218 +0000 UTC m=+204.648181303 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.149768 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:56:19.149750032 +0000 UTC m=+204.648332097 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.170334 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e
2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.182499 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.200251 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.211362 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.222087 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.223241 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.223270 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.223319 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.223235 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.223379 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.223458 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.223547 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.223613 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.236656 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.248326 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.272029 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.284661 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.300183 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.313417 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.326353 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: E0225 10:55:15.332924 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.341391 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.365856 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff8
66a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.380056 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.395442 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.408115 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.420497 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.434151 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 
10:55:15.447434 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.475323 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.487308 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.500922 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.524491 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.544783 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.560394 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.572769 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.943755 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/0.log" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.943876 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerStarted","Data":"450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3"} Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.964925 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.982573 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:15 crc kubenswrapper[4725]: I0225 10:55:15.997933 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:15Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.018889 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.037203 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.056693 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.074422 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.089889 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.104335 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.121780 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.137655 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc 
kubenswrapper[4725]: I0225 10:55:16.151941 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.164642 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.175845 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.190139 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] 
Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.213487 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.228596 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.243667 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.253753 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.792311 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.792343 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.792351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.792364 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.792373 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:16Z","lastTransitionTime":"2026-02-25T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:16 crc kubenswrapper[4725]: E0225 10:55:16.806574 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.810253 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.810294 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.810305 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.810320 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.810328 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:16Z","lastTransitionTime":"2026-02-25T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:16 crc kubenswrapper[4725]: E0225 10:55:16.831403 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.835630 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.835676 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.835687 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.835705 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.835717 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:16Z","lastTransitionTime":"2026-02-25T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:16 crc kubenswrapper[4725]: E0225 10:55:16.845689 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.849418 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.849458 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.849473 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.849490 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.849504 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:16Z","lastTransitionTime":"2026-02-25T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:16 crc kubenswrapper[4725]: E0225 10:55:16.863843 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.867338 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.867376 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.867389 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.867410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:16 crc kubenswrapper[4725]: I0225 10:55:16.867424 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:16Z","lastTransitionTime":"2026-02-25T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:16 crc kubenswrapper[4725]: E0225 10:55:16.879736 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:16Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:16 crc kubenswrapper[4725]: E0225 10:55:16.879971 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:55:17 crc kubenswrapper[4725]: I0225 10:55:17.223723 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:17 crc kubenswrapper[4725]: I0225 10:55:17.223794 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:17 crc kubenswrapper[4725]: I0225 10:55:17.223758 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:17 crc kubenswrapper[4725]: I0225 10:55:17.223723 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:17 crc kubenswrapper[4725]: E0225 10:55:17.223977 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:17 crc kubenswrapper[4725]: E0225 10:55:17.224084 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:17 crc kubenswrapper[4725]: E0225 10:55:17.224239 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:17 crc kubenswrapper[4725]: E0225 10:55:17.224340 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:19 crc kubenswrapper[4725]: I0225 10:55:19.223786 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:19 crc kubenswrapper[4725]: I0225 10:55:19.224115 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:19 crc kubenswrapper[4725]: E0225 10:55:19.224205 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:19 crc kubenswrapper[4725]: I0225 10:55:19.224036 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:19 crc kubenswrapper[4725]: I0225 10:55:19.224015 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:19 crc kubenswrapper[4725]: E0225 10:55:19.224308 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:19 crc kubenswrapper[4725]: E0225 10:55:19.224373 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:19 crc kubenswrapper[4725]: E0225 10:55:19.224531 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:20 crc kubenswrapper[4725]: E0225 10:55:20.334087 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:21 crc kubenswrapper[4725]: I0225 10:55:21.223461 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:21 crc kubenswrapper[4725]: I0225 10:55:21.223506 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:21 crc kubenswrapper[4725]: I0225 10:55:21.223512 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:21 crc kubenswrapper[4725]: I0225 10:55:21.223471 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:21 crc kubenswrapper[4725]: E0225 10:55:21.223573 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:21 crc kubenswrapper[4725]: E0225 10:55:21.223667 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:21 crc kubenswrapper[4725]: E0225 10:55:21.223742 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:21 crc kubenswrapper[4725]: E0225 10:55:21.224149 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:22 crc kubenswrapper[4725]: I0225 10:55:22.224343 4725 scope.go:117] "RemoveContainer" containerID="5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899" Feb 25 10:55:22 crc kubenswrapper[4725]: I0225 10:55:22.968468 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/2.log" Feb 25 10:55:22 crc kubenswrapper[4725]: I0225 10:55:22.971684 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} Feb 25 10:55:22 crc kubenswrapper[4725]: I0225 10:55:22.972370 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:55:22 crc kubenswrapper[4725]: I0225 10:55:22.984532 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:22Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.009656 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.026689 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.050774 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.070470 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", 
UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.082034 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d953
8695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.094100 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.106622 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.120597 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25
T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.133525 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.148017 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.162551 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.178246 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.200894 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.216594 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] 
Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.223869 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:23 crc kubenswrapper[4725]: E0225 10:55:23.223972 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.224120 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:23 crc kubenswrapper[4725]: E0225 10:55:23.224178 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.224300 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:23 crc kubenswrapper[4725]: E0225 10:55:23.224360 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.224483 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:23 crc kubenswrapper[4725]: E0225 10:55:23.224535 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.231822 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc 
kubenswrapper[4725]: I0225 10:55:23.249897 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.262582 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.275707 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:23Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.977721 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/3.log" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.978773 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/2.log" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.981769 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" exitCode=1 Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.981812 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.981895 4725 scope.go:117] "RemoveContainer" containerID="5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899" Feb 25 10:55:23 crc kubenswrapper[4725]: I0225 10:55:23.983156 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 10:55:23 crc kubenswrapper[4725]: E0225 10:55:23.983472 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.015943 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.031360 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5
b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.046011 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.057154 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.070995 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.087088 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 
10:55:24.101277 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.122103 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a1895411f9d6df631bf83edeafcb45de30797d19a4426d1773adc6d120d6899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:54:53Z\\\",\\\"message\\\":\\\"roller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:54:53Z is after 2025-08-24T17:21:41Z]\\\\nI0225 10:54:53.132356 6983 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"54fbe873-7e6d-475f-a0ad-8dd5f06d850d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster\\\\\\\", UUI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:23Z\\\",\\\"message\\\":\\\"event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038421 7307 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038698 7307 ovnkube.go:599] Stopped ovnkube\\\\nI0225 10:55:23.038747 7307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0225 10:55:23.038841 7307 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d
81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.135114 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.153889 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.166804 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.180501 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.199628 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.214131 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.230552 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.245632 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.259467 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.273328 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] 
Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.285640 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:24Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.988820 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/3.log" Feb 25 10:55:24 crc kubenswrapper[4725]: I0225 10:55:24.993714 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 10:55:24 crc kubenswrapper[4725]: E0225 10:55:24.993900 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.024726 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.036919 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5
b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.049937 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.062567 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.073618 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.084450 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 
10:55:25.093214 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.109498 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:23Z\\\",\\\"message\\\":\\\"event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038421 7307 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038698 7307 ovnkube.go:599] Stopped ovnkube\\\\nI0225 10:55:23.038747 7307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0225 10:55:23.038841 7307 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:55:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.121236 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.132783 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.145904 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.159945 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.173681 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.189920 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.206417 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.218300 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.224052 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.224071 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.224094 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:25 crc kubenswrapper[4725]: E0225 10:55:25.224189 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:25 crc kubenswrapper[4725]: E0225 10:55:25.224235 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:25 crc kubenswrapper[4725]: E0225 10:55:25.224284 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.224295 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:25 crc kubenswrapper[4725]: E0225 10:55:25.224497 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.228627 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.241679 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] 
Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.252075 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.260756 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T
10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.282633 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.294598 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5
b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.306672 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.327287 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:23Z\\\",\\\"message\\\":\\\"event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038421 7307 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038698 7307 ovnkube.go:599] Stopped ovnkube\\\\nI0225 10:55:23.038747 7307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0225 10:55:23.038841 7307 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:55:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: E0225 10:55:25.334440 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.343238 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.355702 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a
49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.366104 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.379141 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25
T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.395269 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.408446 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.421690 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.434624 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.448912 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.462352 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] 
Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.471938 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.483726 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.496966 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:25 crc kubenswrapper[4725]: I0225 10:55:25.510103 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:25Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.897403 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.898017 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.898047 4725 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.898081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.898100 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:26Z","lastTransitionTime":"2026-02-25T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:55:26 crc kubenswrapper[4725]: E0225 10:55:26.919407 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.923871 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.923907 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.923916 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.923931 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.923943 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:26Z","lastTransitionTime":"2026-02-25T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:55:26 crc kubenswrapper[4725]: E0225 10:55:26.941264 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.946884 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.946937 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.946955 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.946984 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.947003 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:26Z","lastTransitionTime":"2026-02-25T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:26 crc kubenswrapper[4725]: E0225 10:55:26.966041 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.978816 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.978914 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.978940 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.978973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.978995 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:26Z","lastTransitionTime":"2026-02-25T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:26 crc kubenswrapper[4725]: E0225 10:55:26.994011 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:26Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.999251 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.999343 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.999370 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.999404 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:26 crc kubenswrapper[4725]: I0225 10:55:26.999428 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:26Z","lastTransitionTime":"2026-02-25T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:27 crc kubenswrapper[4725]: E0225 10:55:27.018146 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:27Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:27 crc kubenswrapper[4725]: E0225 10:55:27.018376 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:55:27 crc kubenswrapper[4725]: I0225 10:55:27.224152 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:27 crc kubenswrapper[4725]: I0225 10:55:27.224253 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:27 crc kubenswrapper[4725]: I0225 10:55:27.224350 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:27 crc kubenswrapper[4725]: I0225 10:55:27.224271 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:27 crc kubenswrapper[4725]: E0225 10:55:27.224480 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:27 crc kubenswrapper[4725]: E0225 10:55:27.224613 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:27 crc kubenswrapper[4725]: E0225 10:55:27.224800 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:27 crc kubenswrapper[4725]: E0225 10:55:27.224926 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:29 crc kubenswrapper[4725]: I0225 10:55:29.223245 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:29 crc kubenswrapper[4725]: I0225 10:55:29.223369 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:29 crc kubenswrapper[4725]: E0225 10:55:29.223418 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:29 crc kubenswrapper[4725]: I0225 10:55:29.223628 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:29 crc kubenswrapper[4725]: E0225 10:55:29.223637 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:29 crc kubenswrapper[4725]: I0225 10:55:29.223657 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:29 crc kubenswrapper[4725]: E0225 10:55:29.223723 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:29 crc kubenswrapper[4725]: E0225 10:55:29.223915 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:30 crc kubenswrapper[4725]: E0225 10:55:30.336359 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:31 crc kubenswrapper[4725]: I0225 10:55:31.224059 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:31 crc kubenswrapper[4725]: I0225 10:55:31.224169 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:31 crc kubenswrapper[4725]: I0225 10:55:31.224221 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:31 crc kubenswrapper[4725]: I0225 10:55:31.224257 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:31 crc kubenswrapper[4725]: E0225 10:55:31.224474 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:31 crc kubenswrapper[4725]: E0225 10:55:31.224594 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:31 crc kubenswrapper[4725]: E0225 10:55:31.224746 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:31 crc kubenswrapper[4725]: E0225 10:55:31.224943 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:33 crc kubenswrapper[4725]: I0225 10:55:33.223333 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:33 crc kubenswrapper[4725]: I0225 10:55:33.223395 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:33 crc kubenswrapper[4725]: I0225 10:55:33.223421 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:33 crc kubenswrapper[4725]: E0225 10:55:33.223483 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:33 crc kubenswrapper[4725]: I0225 10:55:33.223534 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:33 crc kubenswrapper[4725]: E0225 10:55:33.223913 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:33 crc kubenswrapper[4725]: E0225 10:55:33.224021 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:33 crc kubenswrapper[4725]: E0225 10:55:33.224114 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.223469 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.223553 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:35 crc kubenswrapper[4725]: E0225 10:55:35.223715 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.223916 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:35 crc kubenswrapper[4725]: E0225 10:55:35.224027 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.223969 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:35 crc kubenswrapper[4725]: E0225 10:55:35.224146 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:35 crc kubenswrapper[4725]: E0225 10:55:35.224284 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.244158 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fdc713b9e399b8a1bab7683da3d24c13b7da0d79e6257b6804da8ab945dc76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.257041 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.272407 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fd4a582-ec8c-4d92-af5f-9cda0a573098\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 10:53:58.929429 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 10:53:58.929596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 10:53:58.930561 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4289430008/tls.crt::/tmp/serving-cert-4289430008/tls.key\\\\\\\"\\\\nI0225 10:53:59.157399 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 10:53:59.162049 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 10:53:59.162065 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 10:53:59.162085 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 10:53:59.162091 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 10:53:59.167607 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 10:53:59.167625 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167630 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 10:53:59.167635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 10:53:59.167639 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 10:53:59.167642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 10:53:59.167645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 10:53:59.167781 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 10:53:59.169984 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:53:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c78cc5643f0ea0463f67edb6941920
574deb07d979ea8fda5fb51709a8b5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.289230 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41685679-158d-45eb-8ff1-0634a2e216b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5af130cfc3ec4d1d59fb86f97750dcfce452a9420869da24c61e36692fb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca2dbaf5e51dd7999444ca4782ba69fc970d3482c42eae3a5213ea46fca989a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c32b6bbeed61bd1f5e61561caef8574feb2103c29e00740fb2204fc4d957edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://37e723ea5ff4feee5b8fd93cef5a90f04e52b0ce52aa2674bc6d9c574344d285\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.303213 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b71d53fa-8177-4689-95cc-58ce940cd291\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa205676c6a90a6d12d2cadf35b0ff757c3f827f9f47b08972a83e26a6277a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4ed7b5736e8db99b3fdfd8a852aa283e8f2e720c0d74b7e215baf65de06ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.320173 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8691d03226e158e9e6e975e7242999867d60af1ed9b5082352b564973b2f958e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39badbdf128906cac1435a9c55680603c6d24f92914ac350d9d28fcfd4641720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.332664 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7k279" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"708f426f-f477-476b-92eb-7ab94a133335\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7lwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7k279\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc 
kubenswrapper[4725]: E0225 10:55:35.337192 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.354524 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86687c37e57a83910c27f3e6f5b31ade62d980ee7da55e38b8b888c16107ba58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving
-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.376534 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.391552 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.412219 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6b9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb276f6-5e43-4b04-a290-42bfdc3b1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:13Z\\\",\\\"message\\\":\\\"2026-02-25T10:54:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da\\\\n2026-02-25T10:54:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_192dfadf-dc16-4000-a270-41b12f49b4da to /host/opt/cni/bin/\\\\n2026-02-25T10:54:28Z [verbose] multus-daemon started\\\\n2026-02-25T10:54:28Z [verbose] 
Readiness Indicator file check\\\\n2026-02-25T10:55:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwml6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6b9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.444728 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b600b8a3-eb62-43ed-96ae-798a7180f3d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aab5b061faf21d855cd28a19e42234978cc36a6cbdd769258b326e4c6d1decb\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73db166174a919e3d2212d1245746ef1e5162c778ac20efa956d0501a74ed17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bb44018e70c59c021990ce06397cf0cd1afbcebd43221dc30c6d03da721233e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d30a09435b429714a7be6dee53b12d181874cb4e7c282819a69b488dd74493e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45915492e41585bf63d5bae4bda83778edd6d4a2d9642ef54613750f7c1507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4e2af58ad48dee130770ae558fe8f05f2e6e2572360551ac9ff866a626c45af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fad3c6ed615b176159f2aca135db07a2dbf5dea57c89eb5a82e5ff4898f1d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e56405403a5f521a169fff5bfcd8b942c618ce2db177fe65eef420505d5d2953\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.464268 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45281192-e0fb-4146-9356-8b9f873e137c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:52:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbd2d5a02c242c6829dd2d2ec56b8ef3e438e471dd7663b9d6ae562f27f41b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c66c3d7690d321a0e544fe8b52136484bad5957078e52b3cd5b2af19f65dcd1d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T10:53:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 10:52:57.442052 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 10:52:57.448275 1 observer_polling.go:159] Starting file observer\\\\nI0225 10:52:57.498803 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 10:52:57.504379 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 10:53:27.797595 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5b96cc5e54f4954b94834679233c8450958f0fd987daae0868930fa0634228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a0b77b48cd862d75acc79320f184b8531c1b26505aa85328f3b6c275701ea3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:52:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.489563 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8877-1961-407f-b4a7-66e55321a6eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4e1bf5ac0dc03478017c1651a2e2e6b67b9c5cd20f56bc32aba8fae35f0a17b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b2
6\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://585a7909417b9dfea9d289ef6a164e65b0a071834ddeceaa8bc4b3c4dc1849bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://629f100776c37c8f8374d98843ab7ac532eaef291014dfb46cfc17cceedcbc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://e3bce703c178d81777d705ae362cef1f07ff255e26ec8e02f2d5d7be2408c08a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23f1d884aec92fde9b0be17c645062f5f47dcab8b4c8ed7e936b1e5c116de0d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a3b0929c89e5c88ba09f90c1bffe39c1e8f8012d55532fa0666c69dbb299609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ca2cbd9e63cb604a63e511043839c3a32542efbbc67f1ba91c8323aabcb8a2
e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r45dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9mhzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.501384 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9989l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de69f49-3e33-4721-9fee-ad2fc45b16bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289109aefd9715cbbb0c4a8313114a0aaefed9a4f3415b4523c5ace0234f7cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnp2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9989l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.515040 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f769618-965f-430a-8f67-e1ef4d94a063\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bdec2f2be230ffa2f415535414e261c6cb14dd1494472010d016bc0617446b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1136b678f152877870494f8279b7e4610d9538695e2bcad634f831c4c4ad4417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7ngx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rtvsj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.529009 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4742f60-e555-4f96-be12-b9e46a857bd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d59c7ff507cf1804a4e43bcd036bdaa13a8363ddf89418a6f3d60c6b6e678205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-256sf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 
10:55:35.540882 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8zw9d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4a262bc-bc77-471f-91d7-58fb221fa404\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf72d0cea2c95f14abfcddc22f7590a600b5ffb94bfacf2a39a085ab26c554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvjr4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8zw9d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:35 crc kubenswrapper[4725]: I0225 10:55:35.565501 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07a39624-e0d8-44dc-9596-cd7224f58d5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T10:54:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T10:55:23Z\\\",\\\"message\\\":\\\"event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038421 7307 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 10:55:23.038698 7307 ovnkube.go:599] Stopped ovnkube\\\\nI0225 10:55:23.038747 7307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0225 10:55:23.038841 7307 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T10:55:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T10:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b247f6c25aa18444
00df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T10:54:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T10:54:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hct4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T10:54:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6klc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:35Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.224595 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.224627 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.224684 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.224677 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.225976 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.226222 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.226338 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.226471 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.393818 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.394264 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.394351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.394452 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.394564 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:37Z","lastTransitionTime":"2026-02-25T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.407905 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.412174 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.412212 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.412222 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.412238 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.412249 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:37Z","lastTransitionTime":"2026-02-25T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.427464 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.432508 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.432574 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.432586 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.432608 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.432622 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:37Z","lastTransitionTime":"2026-02-25T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.446804 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.451108 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.451158 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.451170 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.451192 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.451205 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:37Z","lastTransitionTime":"2026-02-25T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.491147 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.495892 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.495932 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.495941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.495955 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:37 crc kubenswrapper[4725]: I0225 10:55:37.495965 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:37Z","lastTransitionTime":"2026-02-25T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.510268 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T10:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a6d2d14d-afd1-48db-8d7e-cf300f526a2d\\\",\\\"systemUUID\\\":\\\"aee608f3-29ba-451f-a6f1-6eeae4d0f001\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T10:55:37Z is after 2025-08-24T17:21:41Z" Feb 25 10:55:37 crc kubenswrapper[4725]: E0225 10:55:37.510403 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 10:55:39 crc kubenswrapper[4725]: I0225 10:55:39.223678 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:39 crc kubenswrapper[4725]: E0225 10:55:39.223810 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:39 crc kubenswrapper[4725]: I0225 10:55:39.223696 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:39 crc kubenswrapper[4725]: E0225 10:55:39.223919 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:39 crc kubenswrapper[4725]: I0225 10:55:39.223912 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:39 crc kubenswrapper[4725]: I0225 10:55:39.223965 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:39 crc kubenswrapper[4725]: E0225 10:55:39.224807 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:39 crc kubenswrapper[4725]: E0225 10:55:39.224989 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:39 crc kubenswrapper[4725]: I0225 10:55:39.225393 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 10:55:39 crc kubenswrapper[4725]: E0225 10:55:39.225689 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:55:40 crc kubenswrapper[4725]: E0225 10:55:40.339023 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:41 crc kubenswrapper[4725]: I0225 10:55:41.224289 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:41 crc kubenswrapper[4725]: E0225 10:55:41.224472 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:41 crc kubenswrapper[4725]: I0225 10:55:41.225617 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:41 crc kubenswrapper[4725]: E0225 10:55:41.225731 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:41 crc kubenswrapper[4725]: I0225 10:55:41.225791 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:41 crc kubenswrapper[4725]: E0225 10:55:41.225917 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:41 crc kubenswrapper[4725]: I0225 10:55:41.226082 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:41 crc kubenswrapper[4725]: E0225 10:55:41.226310 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:43 crc kubenswrapper[4725]: I0225 10:55:43.223937 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:43 crc kubenswrapper[4725]: I0225 10:55:43.223939 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:43 crc kubenswrapper[4725]: I0225 10:55:43.223962 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:43 crc kubenswrapper[4725]: I0225 10:55:43.224055 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:43 crc kubenswrapper[4725]: E0225 10:55:43.224254 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:43 crc kubenswrapper[4725]: E0225 10:55:43.224431 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:43 crc kubenswrapper[4725]: E0225 10:55:43.224568 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:43 crc kubenswrapper[4725]: E0225 10:55:43.224604 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.223683 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.223751 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:45 crc kubenswrapper[4725]: E0225 10:55:45.224025 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.224084 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.224109 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:45 crc kubenswrapper[4725]: E0225 10:55:45.224179 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:45 crc kubenswrapper[4725]: E0225 10:55:45.224303 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:45 crc kubenswrapper[4725]: E0225 10:55:45.224469 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.301561 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d6b9f" podStartSLOduration=116.301528406 podStartE2EDuration="1m56.301528406s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.286915721 +0000 UTC m=+170.785497766" watchObservedRunningTime="2026-02-25 10:55:45.301528406 +0000 UTC m=+170.800110431" Feb 25 10:55:45 crc kubenswrapper[4725]: E0225 10:55:45.339649 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.359011 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=54.358990343 podStartE2EDuration="54.358990343s" podCreationTimestamp="2026-02-25 10:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.342102429 +0000 UTC m=+170.840684474" watchObservedRunningTime="2026-02-25 10:55:45.358990343 +0000 UTC m=+170.857572368" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.370533 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9mhzp" podStartSLOduration=116.370515049 podStartE2EDuration="1m56.370515049s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.359381793 +0000 UTC m=+170.857963818" watchObservedRunningTime="2026-02-25 10:55:45.370515049 +0000 UTC m=+170.869097074" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.415047 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.415025772 podStartE2EDuration="1m7.415025772s" podCreationTimestamp="2026-02-25 10:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.413049612 +0000 UTC m=+170.911631687" watchObservedRunningTime="2026-02-25 10:55:45.415025772 +0000 UTC m=+170.913607817" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.415577 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9989l" podStartSLOduration=116.415566186 podStartE2EDuration="1m56.415566186s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.371128494 +0000 UTC m=+170.869710519" watchObservedRunningTime="2026-02-25 10:55:45.415566186 +0000 UTC m=+170.914148251" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.439643 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podStartSLOduration=116.439617784 podStartE2EDuration="1m56.439617784s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.439534112 +0000 UTC m=+170.938116137" watchObservedRunningTime="2026-02-25 10:55:45.439617784 +0000 UTC m=+170.938199819" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.464199 4725 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8zw9d" podStartSLOduration=116.464176955 podStartE2EDuration="1m56.464176955s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.464135114 +0000 UTC m=+170.962717149" watchObservedRunningTime="2026-02-25 10:55:45.464176955 +0000 UTC m=+170.962758980" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.514963 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rtvsj" podStartSLOduration=116.51494439 podStartE2EDuration="1m56.51494439s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.51418898 +0000 UTC m=+171.012771005" watchObservedRunningTime="2026-02-25 10:55:45.51494439 +0000 UTC m=+171.013526415" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.539538 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=32.539521781 podStartE2EDuration="32.539521781s" podCreationTimestamp="2026-02-25 10:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.529238737 +0000 UTC m=+171.027820762" watchObservedRunningTime="2026-02-25 10:55:45.539521781 +0000 UTC m=+171.038103806" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.550515 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=37.550502593 podStartE2EDuration="37.550502593s" podCreationTimestamp="2026-02-25 10:55:08 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.53947305 +0000 UTC m=+171.038055085" watchObservedRunningTime="2026-02-25 10:55:45.550502593 +0000 UTC m=+171.049084618" Feb 25 10:55:45 crc kubenswrapper[4725]: I0225 10:55:45.595820 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.595798887 podStartE2EDuration="1m27.595798887s" podCreationTimestamp="2026-02-25 10:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:45.595591092 +0000 UTC m=+171.094173137" watchObservedRunningTime="2026-02-25 10:55:45.595798887 +0000 UTC m=+171.094380912" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.224107 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.224272 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:47 crc kubenswrapper[4725]: E0225 10:55:47.224362 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.224412 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.224412 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:47 crc kubenswrapper[4725]: E0225 10:55:47.224538 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:47 crc kubenswrapper[4725]: E0225 10:55:47.224708 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:47 crc kubenswrapper[4725]: E0225 10:55:47.224821 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.517662 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.517727 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.517751 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.517781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.517805 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T10:55:47Z","lastTransitionTime":"2026-02-25T10:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.570934 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q"] Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.571305 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.573261 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.573347 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.574027 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.574099 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.712449 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b91e060-cd53-483a-a15c-04c008215bcd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.712579 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b91e060-cd53-483a-a15c-04c008215bcd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.712626 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6b91e060-cd53-483a-a15c-04c008215bcd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.712665 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b91e060-cd53-483a-a15c-04c008215bcd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.712712 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b91e060-cd53-483a-a15c-04c008215bcd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.814685 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b91e060-cd53-483a-a15c-04c008215bcd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.814768 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b91e060-cd53-483a-a15c-04c008215bcd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 
25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.814856 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b91e060-cd53-483a-a15c-04c008215bcd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.814883 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b91e060-cd53-483a-a15c-04c008215bcd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.814909 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b91e060-cd53-483a-a15c-04c008215bcd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.814994 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b91e060-cd53-483a-a15c-04c008215bcd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.815029 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b91e060-cd53-483a-a15c-04c008215bcd-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.816488 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b91e060-cd53-483a-a15c-04c008215bcd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.824903 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b91e060-cd53-483a-a15c-04c008215bcd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.831449 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b91e060-cd53-483a-a15c-04c008215bcd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-46v5q\" (UID: \"6b91e060-cd53-483a-a15c-04c008215bcd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: I0225 10:55:47.895761 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" Feb 25 10:55:47 crc kubenswrapper[4725]: W0225 10:55:47.918457 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b91e060_cd53_483a_a15c_04c008215bcd.slice/crio-2e60e4a5f990b5076d154c333acf1ff6cc295268fc04ab9764157c5b6b616347 WatchSource:0}: Error finding container 2e60e4a5f990b5076d154c333acf1ff6cc295268fc04ab9764157c5b6b616347: Status 404 returned error can't find the container with id 2e60e4a5f990b5076d154c333acf1ff6cc295268fc04ab9764157c5b6b616347 Feb 25 10:55:48 crc kubenswrapper[4725]: I0225 10:55:48.071347 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" event={"ID":"6b91e060-cd53-483a-a15c-04c008215bcd","Type":"ContainerStarted","Data":"eb3e29571a94f574142b3d459fc4326368c0a54e4a9157f6de7e159492868f06"} Feb 25 10:55:48 crc kubenswrapper[4725]: I0225 10:55:48.071406 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" event={"ID":"6b91e060-cd53-483a-a15c-04c008215bcd","Type":"ContainerStarted","Data":"2e60e4a5f990b5076d154c333acf1ff6cc295268fc04ab9764157c5b6b616347"} Feb 25 10:55:48 crc kubenswrapper[4725]: I0225 10:55:48.087776 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46v5q" podStartSLOduration=119.087754147 podStartE2EDuration="1m59.087754147s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:55:48.087433549 +0000 UTC m=+173.586015574" watchObservedRunningTime="2026-02-25 10:55:48.087754147 +0000 UTC m=+173.586336182" Feb 25 10:55:48 crc kubenswrapper[4725]: I0225 10:55:48.255638 4725 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 25 10:55:48 crc kubenswrapper[4725]: I0225 10:55:48.263682 4725 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 25 10:55:49 crc kubenswrapper[4725]: I0225 10:55:49.223655 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:49 crc kubenswrapper[4725]: I0225 10:55:49.223691 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:49 crc kubenswrapper[4725]: E0225 10:55:49.223809 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:49 crc kubenswrapper[4725]: I0225 10:55:49.223898 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:49 crc kubenswrapper[4725]: E0225 10:55:49.223994 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:49 crc kubenswrapper[4725]: E0225 10:55:49.224052 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:49 crc kubenswrapper[4725]: I0225 10:55:49.224351 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:49 crc kubenswrapper[4725]: E0225 10:55:49.224595 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:50 crc kubenswrapper[4725]: E0225 10:55:50.340606 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:51 crc kubenswrapper[4725]: I0225 10:55:51.224171 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:51 crc kubenswrapper[4725]: I0225 10:55:51.224207 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:51 crc kubenswrapper[4725]: I0225 10:55:51.224194 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:51 crc kubenswrapper[4725]: I0225 10:55:51.224171 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:51 crc kubenswrapper[4725]: E0225 10:55:51.224304 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:51 crc kubenswrapper[4725]: E0225 10:55:51.224376 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:51 crc kubenswrapper[4725]: E0225 10:55:51.224445 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:51 crc kubenswrapper[4725]: E0225 10:55:51.224499 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:52 crc kubenswrapper[4725]: I0225 10:55:52.224021 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 10:55:52 crc kubenswrapper[4725]: E0225 10:55:52.224234 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6klc9_openshift-ovn-kubernetes(07a39624-e0d8-44dc-9596-cd7224f58d5d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" Feb 25 10:55:53 crc kubenswrapper[4725]: I0225 10:55:53.223564 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:53 crc kubenswrapper[4725]: I0225 10:55:53.223606 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:53 crc kubenswrapper[4725]: I0225 10:55:53.223581 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:53 crc kubenswrapper[4725]: E0225 10:55:53.223691 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:53 crc kubenswrapper[4725]: E0225 10:55:53.223919 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:53 crc kubenswrapper[4725]: I0225 10:55:53.223939 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:53 crc kubenswrapper[4725]: E0225 10:55:53.224099 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:53 crc kubenswrapper[4725]: E0225 10:55:53.224171 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:55 crc kubenswrapper[4725]: I0225 10:55:55.224104 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:55 crc kubenswrapper[4725]: I0225 10:55:55.224146 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:55 crc kubenswrapper[4725]: I0225 10:55:55.224113 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:55 crc kubenswrapper[4725]: E0225 10:55:55.224205 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:55 crc kubenswrapper[4725]: I0225 10:55:55.224109 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:55 crc kubenswrapper[4725]: E0225 10:55:55.224331 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:55 crc kubenswrapper[4725]: E0225 10:55:55.224452 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:55 crc kubenswrapper[4725]: E0225 10:55:55.224588 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:55 crc kubenswrapper[4725]: E0225 10:55:55.341340 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:55:57 crc kubenswrapper[4725]: I0225 10:55:57.223932 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:57 crc kubenswrapper[4725]: I0225 10:55:57.224009 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:57 crc kubenswrapper[4725]: I0225 10:55:57.224108 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:57 crc kubenswrapper[4725]: E0225 10:55:57.224097 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:57 crc kubenswrapper[4725]: I0225 10:55:57.224127 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:57 crc kubenswrapper[4725]: E0225 10:55:57.224370 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:57 crc kubenswrapper[4725]: E0225 10:55:57.224532 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:55:57 crc kubenswrapper[4725]: E0225 10:55:57.224607 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:59 crc kubenswrapper[4725]: I0225 10:55:59.224244 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:55:59 crc kubenswrapper[4725]: I0225 10:55:59.224377 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:55:59 crc kubenswrapper[4725]: I0225 10:55:59.224281 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:55:59 crc kubenswrapper[4725]: I0225 10:55:59.224439 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:55:59 crc kubenswrapper[4725]: E0225 10:55:59.225131 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:55:59 crc kubenswrapper[4725]: E0225 10:55:59.225296 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:55:59 crc kubenswrapper[4725]: E0225 10:55:59.225432 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:55:59 crc kubenswrapper[4725]: E0225 10:55:59.225569 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:00 crc kubenswrapper[4725]: E0225 10:56:00.342813 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.113110 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/1.log" Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.113697 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/0.log" Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.113726 4725 generic.go:334] "Generic (PLEG): container finished" podID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" containerID="450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3" exitCode=1 Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.113752 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerDied","Data":"450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3"} Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.113784 4725 scope.go:117] "RemoveContainer" containerID="5f0e3ac5242f1aa83b00eab23290e7e8bb4b3061693efa3cafd1cef47e4f9147" Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.115293 4725 scope.go:117] "RemoveContainer" containerID="450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3" Feb 25 10:56:01 crc kubenswrapper[4725]: E0225 10:56:01.117133 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-d6b9f_openshift-multus(7fb276f6-5e43-4b04-a290-42bfdc3b1125)\"" pod="openshift-multus/multus-d6b9f" podUID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.224266 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.224723 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:01 crc kubenswrapper[4725]: E0225 10:56:01.225408 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.225110 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:01 crc kubenswrapper[4725]: E0225 10:56:01.225591 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:01 crc kubenswrapper[4725]: I0225 10:56:01.225202 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:01 crc kubenswrapper[4725]: E0225 10:56:01.225752 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:01 crc kubenswrapper[4725]: E0225 10:56:01.224953 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:02 crc kubenswrapper[4725]: I0225 10:56:02.117219 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/1.log" Feb 25 10:56:03 crc kubenswrapper[4725]: I0225 10:56:03.223241 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:03 crc kubenswrapper[4725]: I0225 10:56:03.223384 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:03 crc kubenswrapper[4725]: I0225 10:56:03.223479 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:03 crc kubenswrapper[4725]: E0225 10:56:03.223479 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:03 crc kubenswrapper[4725]: I0225 10:56:03.223705 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:03 crc kubenswrapper[4725]: E0225 10:56:03.223799 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:03 crc kubenswrapper[4725]: E0225 10:56:03.223637 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:03 crc kubenswrapper[4725]: E0225 10:56:03.223950 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:05 crc kubenswrapper[4725]: I0225 10:56:05.223348 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:05 crc kubenswrapper[4725]: I0225 10:56:05.223426 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:05 crc kubenswrapper[4725]: I0225 10:56:05.223426 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:05 crc kubenswrapper[4725]: I0225 10:56:05.225891 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:05 crc kubenswrapper[4725]: E0225 10:56:05.225899 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:05 crc kubenswrapper[4725]: E0225 10:56:05.226040 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:05 crc kubenswrapper[4725]: E0225 10:56:05.226093 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:05 crc kubenswrapper[4725]: E0225 10:56:05.226243 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:05 crc kubenswrapper[4725]: E0225 10:56:05.343455 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:56:07 crc kubenswrapper[4725]: I0225 10:56:07.223992 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:07 crc kubenswrapper[4725]: I0225 10:56:07.224069 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:07 crc kubenswrapper[4725]: I0225 10:56:07.224020 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:07 crc kubenswrapper[4725]: E0225 10:56:07.224326 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:07 crc kubenswrapper[4725]: I0225 10:56:07.224889 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:07 crc kubenswrapper[4725]: E0225 10:56:07.225031 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:07 crc kubenswrapper[4725]: E0225 10:56:07.225805 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:07 crc kubenswrapper[4725]: E0225 10:56:07.225952 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:07 crc kubenswrapper[4725]: I0225 10:56:07.226314 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 10:56:08 crc kubenswrapper[4725]: I0225 10:56:08.088348 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7k279"] Feb 25 10:56:08 crc kubenswrapper[4725]: I0225 10:56:08.143349 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/3.log" Feb 25 10:56:08 crc kubenswrapper[4725]: I0225 10:56:08.146125 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerStarted","Data":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} Feb 25 10:56:08 crc kubenswrapper[4725]: I0225 10:56:08.146205 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:08 crc kubenswrapper[4725]: E0225 10:56:08.146349 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:08 crc kubenswrapper[4725]: I0225 10:56:08.146996 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:56:08 crc kubenswrapper[4725]: I0225 10:56:08.177897 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podStartSLOduration=139.177874916 podStartE2EDuration="2m19.177874916s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:08.176368358 +0000 UTC m=+193.674950413" watchObservedRunningTime="2026-02-25 10:56:08.177874916 +0000 UTC m=+193.676456951" Feb 25 10:56:09 crc kubenswrapper[4725]: I0225 10:56:09.223964 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:09 crc kubenswrapper[4725]: I0225 10:56:09.224034 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:09 crc kubenswrapper[4725]: I0225 10:56:09.224064 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:09 crc kubenswrapper[4725]: E0225 10:56:09.224141 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:09 crc kubenswrapper[4725]: E0225 10:56:09.224294 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:09 crc kubenswrapper[4725]: E0225 10:56:09.224454 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:10 crc kubenswrapper[4725]: I0225 10:56:10.223675 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:10 crc kubenswrapper[4725]: E0225 10:56:10.223936 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:10 crc kubenswrapper[4725]: E0225 10:56:10.345313 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 10:56:11 crc kubenswrapper[4725]: I0225 10:56:11.224060 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:11 crc kubenswrapper[4725]: I0225 10:56:11.224065 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:11 crc kubenswrapper[4725]: E0225 10:56:11.224262 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:11 crc kubenswrapper[4725]: I0225 10:56:11.224354 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:11 crc kubenswrapper[4725]: E0225 10:56:11.224504 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:11 crc kubenswrapper[4725]: E0225 10:56:11.224601 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:12 crc kubenswrapper[4725]: I0225 10:56:12.223604 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:12 crc kubenswrapper[4725]: E0225 10:56:12.223887 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:13 crc kubenswrapper[4725]: I0225 10:56:13.223867 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:13 crc kubenswrapper[4725]: E0225 10:56:13.224023 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:13 crc kubenswrapper[4725]: I0225 10:56:13.223879 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:13 crc kubenswrapper[4725]: I0225 10:56:13.224077 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:13 crc kubenswrapper[4725]: E0225 10:56:13.224299 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:13 crc kubenswrapper[4725]: E0225 10:56:13.224511 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:14 crc kubenswrapper[4725]: I0225 10:56:14.223768 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:14 crc kubenswrapper[4725]: E0225 10:56:14.224046 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:15 crc kubenswrapper[4725]: I0225 10:56:15.224163 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:15 crc kubenswrapper[4725]: I0225 10:56:15.224214 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:15 crc kubenswrapper[4725]: E0225 10:56:15.226203 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:15 crc kubenswrapper[4725]: I0225 10:56:15.226336 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:15 crc kubenswrapper[4725]: I0225 10:56:15.226980 4725 scope.go:117] "RemoveContainer" containerID="450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3" Feb 25 10:56:15 crc kubenswrapper[4725]: E0225 10:56:15.227250 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:15 crc kubenswrapper[4725]: E0225 10:56:15.227564 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:15 crc kubenswrapper[4725]: E0225 10:56:15.346420 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 10:56:16 crc kubenswrapper[4725]: I0225 10:56:16.178872 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/1.log" Feb 25 10:56:16 crc kubenswrapper[4725]: I0225 10:56:16.179283 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerStarted","Data":"e36f678444c7f8932a1272a93dd2c22ee7a9de5680524aba427e492321e3c745"} Feb 25 10:56:16 crc kubenswrapper[4725]: I0225 10:56:16.223543 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:16 crc kubenswrapper[4725]: E0225 10:56:16.223780 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:17 crc kubenswrapper[4725]: I0225 10:56:17.223930 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:17 crc kubenswrapper[4725]: I0225 10:56:17.223941 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:17 crc kubenswrapper[4725]: I0225 10:56:17.224010 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:17 crc kubenswrapper[4725]: E0225 10:56:17.224818 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:17 crc kubenswrapper[4725]: E0225 10:56:17.224979 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:17 crc kubenswrapper[4725]: E0225 10:56:17.225120 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:18 crc kubenswrapper[4725]: I0225 10:56:18.224215 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:18 crc kubenswrapper[4725]: E0225 10:56:18.224446 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.051680 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.051912 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:58:21.051818614 +0000 UTC m=+326.550400669 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.051978 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.052082 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.052178 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.052204 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.052254 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:58:21.052234925 +0000 UTC m=+326.550816990 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.052279 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 10:58:21.052267426 +0000 UTC m=+326.550849481 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.153540 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.153614 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.153707 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.153769 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.153952 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.153984 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs podName:708f426f-f477-476b-92eb-7ab94a133335 nodeName:}" failed. No retries permitted until 2026-02-25 10:58:21.153947378 +0000 UTC m=+326.652529483 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs") pod "network-metrics-daemon-7k279" (UID: "708f426f-f477-476b-92eb-7ab94a133335") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.153995 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.154011 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.154024 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.154059 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.154139 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.154109 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 10:58:21.154086571 +0000 UTC m=+326.652668696 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.154357 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 10:58:21.154292047 +0000 UTC m=+326.652874152 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.223788 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.224073 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.224432 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.224596 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 10:56:19 crc kubenswrapper[4725]: I0225 10:56:19.224696 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:19 crc kubenswrapper[4725]: E0225 10:56:19.225004 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 10:56:20 crc kubenswrapper[4725]: I0225 10:56:20.224077 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:20 crc kubenswrapper[4725]: E0225 10:56:20.224287 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7k279" podUID="708f426f-f477-476b-92eb-7ab94a133335" Feb 25 10:56:21 crc kubenswrapper[4725]: I0225 10:56:21.224159 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 10:56:21 crc kubenswrapper[4725]: I0225 10:56:21.224216 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:56:21 crc kubenswrapper[4725]: I0225 10:56:21.224279 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 10:56:21 crc kubenswrapper[4725]: I0225 10:56:21.227974 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 25 10:56:21 crc kubenswrapper[4725]: I0225 10:56:21.228385 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 25 10:56:21 crc kubenswrapper[4725]: I0225 10:56:21.229276 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 25 10:56:21 crc kubenswrapper[4725]: I0225 10:56:21.230300 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 25 10:56:22 crc kubenswrapper[4725]: I0225 10:56:22.223379 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279" Feb 25 10:56:22 crc kubenswrapper[4725]: I0225 10:56:22.225789 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 25 10:56:22 crc kubenswrapper[4725]: I0225 10:56:22.228746 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.547247 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.599791 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.600824 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.603541 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.604441 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.604497 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.604465 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.608212 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.608312 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.608337 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.608387 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.608968 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.613921 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.614900 4725 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.615479 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.615525 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.617282 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.617776 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9p4cm"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.617985 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.618315 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.618350 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.618413 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-9p4cm" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.620536 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.620764 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.620983 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.621399 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.621531 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.621661 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.621936 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpmr4"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.622821 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.627321 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5mvj"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.627338 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.628250 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.628265 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.628284 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.628453 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.628574 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.628744 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.630270 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.632810 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-f4l29"] Feb 25 10:56:28 
crc kubenswrapper[4725]: I0225 10:56:28.633490 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.639669 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vtck6"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.640494 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.642345 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.642600 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.642969 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.643196 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.643611 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.643724 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.650791 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.651092 4725 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.651738 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.652039 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.652218 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.652491 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.654093 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.654605 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.655036 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.655635 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.655731 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.657286 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.657718 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.657380 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.672067 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.672380 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.672424 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.672621 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.672677 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.672781 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.672942 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.673000 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.673062 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7lb6x"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.673229 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674657 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-serving-cert\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674695 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a248ae7c-6e03-4e10-bdd5-ef7e31335976-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674724 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-config\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674743 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-audit-policies\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674764 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-encryption-config\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674785 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-service-ca\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674806 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-oauth-config\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674844 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a248ae7c-6e03-4e10-bdd5-ef7e31335976-serving-cert\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674875 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-audit-dir\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674911 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htpcv\" (UniqueName: \"kubernetes.io/projected/a248ae7c-6e03-4e10-bdd5-ef7e31335976-kube-api-access-htpcv\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674932 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-trusted-ca-bundle\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674953 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-serving-cert\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674976 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.674996 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cd8x\" (UniqueName: \"kubernetes.io/projected/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-kube-api-access-9cd8x\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.675016 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zkzx\" (UniqueName: \"kubernetes.io/projected/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-kube-api-access-9zkzx\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.675038 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-etcd-client\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.675071 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.675093 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-oauth-serving-cert\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.675013 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.675684 4725 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.676246 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.676739 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.677048 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.677382 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.677921 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.678822 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.682546 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.683056 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.685067 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.685250 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.685432 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.685866 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686211 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686241 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686416 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686573 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686657 4725 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686788 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686931 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.686987 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.687881 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.688135 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.688377 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.688552 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.689651 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.690056 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.690184 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.690690 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.690845 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.691901 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.692462 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.692471 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.692557 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.692654 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.692791 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.693674 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.693784 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7624"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.694379 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.694499 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.694680 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.694504 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.695800 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.705218 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.705580 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.705769 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.705932 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.706056 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.706184 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.706231 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.706280 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.705784 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.706741 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.706780 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.707033 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.707999 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.708308 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.708647 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.725174 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.726207 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.727031 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5p82k"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.727257 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.727573 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.727813 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.730097 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.730408 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.731705 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.732001 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.732248 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.733439 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.733777 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.734150 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.734287 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6trwd"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.735229 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qsb7p"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.736421 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 25 10:56:28 crc 
kubenswrapper[4725]: I0225 10:56:28.737791 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.737866 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.742012 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.753293 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.755995 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.756623 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.761237 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nrlgl"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.762104 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.762515 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.762918 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.763262 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.764089 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.764459 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.769428 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.770070 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.773806 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.774761 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775165 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533616-zsh9g"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775614 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533616-zsh9g"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775696 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-bound-sa-token\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775735 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c330367-5495-4729-85ef-4ff602ab6808-trusted-ca\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775760 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrd5l\" (UniqueName: \"kubernetes.io/projected/134ee45f-ab84-4033-bc7c-956e7a7721ae-kube-api-access-jrd5l\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775794 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-config\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775812 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-service-ca\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775845 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-audit-policies\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775863 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-encryption-config\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775882 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-oauth-config\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775906 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca7925f-e394-489d-afee-bfe1c49c0ced-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775927 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ths9g\" (UniqueName: \"kubernetes.io/projected/fb51f87b-5859-44b4-ae55-c4f11ed0237b-kube-api-access-ths9g\") pod \"downloads-7954f5f757-9p4cm\" (UID: \"fb51f87b-5859-44b4-ae55-c4f11ed0237b\") " pod="openshift-console/downloads-7954f5f757-9p4cm"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775947 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a248ae7c-6e03-4e10-bdd5-ef7e31335976-serving-cert\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775970 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-audit-dir\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.775987 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-client\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776012 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c330367-5495-4729-85ef-4ff602ab6808-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776033 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ca7925f-e394-489d-afee-bfe1c49c0ced-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776050 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/134ee45f-ab84-4033-bc7c-956e7a7721ae-serving-cert\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776076 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfpnd\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-kube-api-access-xfpnd\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776092 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776098 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htpcv\" (UniqueName: \"kubernetes.io/projected/a248ae7c-6e03-4e10-bdd5-ef7e31335976-kube-api-access-htpcv\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776117 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5sws\" (UniqueName: \"kubernetes.io/projected/9ca7925f-e394-489d-afee-bfe1c49c0ced-kube-api-access-t5sws\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776148 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776165 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwlrn\" (UniqueName: \"kubernetes.io/projected/b0766ac3-b78a-453e-a45e-ad88770d2513-kube-api-access-kwlrn\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776184 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-trusted-ca-bundle\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776208 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776238 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776225 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-serving-cert\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776264 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776289 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cd8x\" (UniqueName: \"kubernetes.io/projected/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-kube-api-access-9cd8x\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776313 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zkzx\" (UniqueName: \"kubernetes.io/projected/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-kube-api-access-9zkzx\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776338 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c330367-5495-4729-85ef-4ff602ab6808-metrics-tls\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776362 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-service-ca\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776384 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-config\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776404 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-etcd-client\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776431 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0766ac3-b78a-453e-a45e-ad88770d2513-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776451 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f97b6a23-48f6-459d-bed6-ccaa1c917e8a-metrics-tls\") pod \"dns-operator-744455d44c-vtck6\" (UID: \"f97b6a23-48f6-459d-bed6-ccaa1c917e8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtck6"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776485 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5n5\" (UniqueName: \"kubernetes.io/projected/0c330367-5495-4729-85ef-4ff602ab6808-kube-api-access-6b5n5\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776506 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776547 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-ca\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776568 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-tls\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776588 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776607 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-certificates\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776626 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-trusted-ca\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776643 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0766ac3-b78a-453e-a45e-ad88770d2513-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776661 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-oauth-serving-cert\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776679 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z564m\" (UniqueName: \"kubernetes.io/projected/f97b6a23-48f6-459d-bed6-ccaa1c917e8a-kube-api-access-z564m\") pod \"dns-operator-744455d44c-vtck6\" (UID: \"f97b6a23-48f6-459d-bed6-ccaa1c917e8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtck6"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776716 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wk9p\" (UniqueName: \"kubernetes.io/projected/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-kube-api-access-9wk9p\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776735 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776761 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776787 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776816 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-serving-cert\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.776857 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a248ae7c-6e03-4e10-bdd5-ef7e31335976-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.777292 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a248ae7c-6e03-4e10-bdd5-ef7e31335976-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.780061 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.780689 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-56xfg"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.781213 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-service-ca\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.781324 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brdsl"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.781628 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.781741 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-audit-policies\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.781756 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-brdsl"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.781793 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.782709 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: E0225 10:56:28.783205 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.283158031 +0000 UTC m=+214.781763857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.783371 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-trusted-ca-bundle\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.783487 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-oauth-serving-cert\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.783552 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-audit-dir\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.784130 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.784238 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-config\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.784851 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.785400 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.788921 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.789939 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.791109 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-oauth-config\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.791928 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.793264 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-serving-cert\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.795199 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-encryption-config\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.798195 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-serving-cert\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.798646 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a248ae7c-6e03-4e10-bdd5-ef7e31335976-serving-cert\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.804797 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.809905 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.810806 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.810930 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.812058 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.813309 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.820906 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-etcd-client\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.824391 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5c7g8"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.825053 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.825209 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5c7g8"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.827552 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9p4cm"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.827603 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.828400 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-f4l29"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.829281 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.830346 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.834424 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.835708 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7624"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.836010 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpmr4"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.837589 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.839658 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.843037 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5mvj"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.843135 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qsb7p"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.843839 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.844001 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.848419 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.855555 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.858407 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zszmh"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.859805 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ql8k8"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.861058 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zszmh"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.861916 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.862021 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.866287 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.869919 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6trwd"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.870931 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533616-zsh9g"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.873352 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nrlgl"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.876053 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vtck6"]
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.877682 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.877945 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-ca\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:28 crc kubenswrapper[4725]: E0225 10:56:28.878186 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.378150852 +0000 UTC m=+214.876732877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878238 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-tls\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878281 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-trusted-ca\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878300 4725 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0766ac3-b78a-453e-a45e-ad88770d2513-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878321 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-certificates\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878341 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z564m\" (UniqueName: \"kubernetes.io/projected/f97b6a23-48f6-459d-bed6-ccaa1c917e8a-kube-api-access-z564m\") pod \"dns-operator-744455d44c-vtck6\" (UID: \"f97b6a23-48f6-459d-bed6-ccaa1c917e8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878364 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878379 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wk9p\" (UniqueName: \"kubernetes.io/projected/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-kube-api-access-9wk9p\") pod 
\"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878399 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878430 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878456 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-bound-sa-token\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878475 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c330367-5495-4729-85ef-4ff602ab6808-trusted-ca\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878496 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jrd5l\" (UniqueName: \"kubernetes.io/projected/134ee45f-ab84-4033-bc7c-956e7a7721ae-kube-api-access-jrd5l\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878534 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca7925f-e394-489d-afee-bfe1c49c0ced-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878556 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ths9g\" (UniqueName: \"kubernetes.io/projected/fb51f87b-5859-44b4-ae55-c4f11ed0237b-kube-api-access-ths9g\") pod \"downloads-7954f5f757-9p4cm\" (UID: \"fb51f87b-5859-44b4-ae55-c4f11ed0237b\") " pod="openshift-console/downloads-7954f5f757-9p4cm" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-client\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878602 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfpnd\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-kube-api-access-xfpnd\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 
10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878620 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c330367-5495-4729-85ef-4ff602ab6808-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878637 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ca7925f-e394-489d-afee-bfe1c49c0ced-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878655 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/134ee45f-ab84-4033-bc7c-956e7a7721ae-serving-cert\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878684 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5sws\" (UniqueName: \"kubernetes.io/projected/9ca7925f-e394-489d-afee-bfe1c49c0ced-kube-api-access-t5sws\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878705 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: 
\"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878724 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwlrn\" (UniqueName: \"kubernetes.io/projected/b0766ac3-b78a-453e-a45e-ad88770d2513-kube-api-access-kwlrn\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878742 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878762 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-service-ca\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878793 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c330367-5495-4729-85ef-4ff602ab6808-metrics-tls\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878817 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-config\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878858 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0766ac3-b78a-453e-a45e-ad88770d2513-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f97b6a23-48f6-459d-bed6-ccaa1c917e8a-metrics-tls\") pod \"dns-operator-744455d44c-vtck6\" (UID: \"f97b6a23-48f6-459d-bed6-ccaa1c917e8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878899 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5n5\" (UniqueName: \"kubernetes.io/projected/0c330367-5495-4729-85ef-4ff602ab6808-kube-api-access-6b5n5\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.878923 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" Feb 25 10:56:28 crc kubenswrapper[4725]: 
I0225 10:56:28.878923 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-ca\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.879899 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ca7925f-e394-489d-afee-bfe1c49c0ced-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" Feb 25 10:56:28 crc kubenswrapper[4725]: E0225 10:56:28.880429 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.3804026 +0000 UTC m=+214.878984805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.880746 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-service-ca\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.880868 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0766ac3-b78a-453e-a45e-ad88770d2513-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.881092 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c330367-5495-4729-85ef-4ff602ab6808-trusted-ca\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.881489 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/134ee45f-ab84-4033-bc7c-956e7a7721ae-config\") pod \"etcd-operator-b45778765-p5mvj\" (UID: 
\"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.882546 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-trusted-ca\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.882863 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.883083 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-certificates\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.883748 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/134ee45f-ab84-4033-bc7c-956e7a7721ae-etcd-client\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.883986 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.884293 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0766ac3-b78a-453e-a45e-ad88770d2513-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.884046 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.885515 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.886896 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-tls\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.887341 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c330367-5495-4729-85ef-4ff602ab6808-metrics-tls\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: 
\"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.887481 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f97b6a23-48f6-459d-bed6-ccaa1c917e8a-metrics-tls\") pod \"dns-operator-744455d44c-vtck6\" (UID: \"f97b6a23-48f6-459d-bed6-ccaa1c917e8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.887695 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca7925f-e394-489d-afee-bfe1c49c0ced-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.887701 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.888142 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.890212 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/134ee45f-ab84-4033-bc7c-956e7a7721ae-serving-cert\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.891417 
4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.893133 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.894589 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5p82k"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.896069 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.897569 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ql8k8"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.899125 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zszmh"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.900616 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.902120 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.908227 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.909589 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.911134 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.913735 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-56xfg"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.916082 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.917634 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brdsl"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.918700 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.919699 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b6x7k"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.920441 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b6x7k" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.920684 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b6x7k"] Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.922732 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.943044 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.962660 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.979373 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:28 crc kubenswrapper[4725]: E0225 10:56:28.979560 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.479540147 +0000 UTC m=+214.978122162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:28 crc kubenswrapper[4725]: I0225 10:56:28.983099 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.016434 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.028284 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.042982 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.065465 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.080643 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.081356 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.581336623 +0000 UTC m=+215.079918648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.087354 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.102942 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.122946 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.143070 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.162852 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.181794 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.181986 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.681957519 +0000 UTC m=+215.180539554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.182325 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.182757 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.682744199 +0000 UTC m=+215.181326224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.189649 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.203406 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.222935 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.244096 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.263207 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.283353 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.283523 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.783502578 +0000 UTC m=+215.282084593 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.283744 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.283789 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.284075 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.784064082 +0000 UTC m=+215.282646117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.303275 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.324538 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.343898 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.363292 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.384303 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.384882 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.884860402 +0000 UTC m=+215.383442457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.392886 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.403990 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.424050 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.444106 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.463407 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.484186 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.485568 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc 
kubenswrapper[4725]: E0225 10:56:29.486007 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:29.985995551 +0000 UTC m=+215.484577576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.504320 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.523804 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.543738 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.564120 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.582893 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.586990 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.587187 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.08716423 +0000 UTC m=+215.585746305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.587423 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.587993 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.087968001 +0000 UTC m=+215.586550106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.603637 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.624715 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.643670 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.674577 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.683012 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.688536 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.688785 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.188761491 +0000 UTC m=+215.687343556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.689063 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.689502 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.189480089 +0000 UTC m=+215.688062154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.704568 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.723922 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.743735 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.763861 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.781237 4725 request.go:700] Waited for 1.017594206s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0 Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.783283 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.789966 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.790401 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.29031883 +0000 UTC m=+215.788900885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.791208 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.291186063 +0000 UTC m=+215.789768108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.790625 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.811478 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.823726 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.844530 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.863358 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.883420 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.893384 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.893984 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.393960884 +0000 UTC m=+215.892542949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.903164 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.923715 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.943560 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.964395 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.984546 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 25 10:56:29 crc kubenswrapper[4725]: I0225 10:56:29.995178 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:29 crc kubenswrapper[4725]: E0225 10:56:29.995703 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.495670227 +0000 UTC m=+215.994252292 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.004348 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.023927 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.044321 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.063516 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.083486 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.096873 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.097074 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.597050422 +0000 UTC m=+216.095632457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.097438 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.097932 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.597911314 +0000 UTC m=+216.096493379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.103642 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.124272 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.144334 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.165781 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.184293 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.198191 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.198626 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.698596241 +0000 UTC m=+216.197178306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.204190 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.223485 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.272795 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cd8x\" (UniqueName: \"kubernetes.io/projected/7dc35e2f-3a10-41b4-ac03-753e62ff89a6-kube-api-access-9cd8x\") pod \"apiserver-7bbb656c7d-wntf7\" (UID: \"7dc35e2f-3a10-41b4-ac03-753e62ff89a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.293203 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zkzx\" (UniqueName: \"kubernetes.io/projected/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-kube-api-access-9zkzx\") pod \"console-f9d7485db-f4l29\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.299914 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.300622 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.800596102 +0000 UTC m=+216.299178167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.304314 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.310746 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htpcv\" (UniqueName: \"kubernetes.io/projected/a248ae7c-6e03-4e10-bdd5-ef7e31335976-kube-api-access-htpcv\") pod \"openshift-config-operator-7777fb866f-hwdf9\" (UID: \"a248ae7c-6e03-4e10-bdd5-ef7e31335976\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.324183 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.342998 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.363677 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.384058 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.401067 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.401619 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:30.901591087 +0000 UTC m=+216.400173142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.403023 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.424181 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.433391 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.443499 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.456076 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.463604 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.483299 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.502460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.502954 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.002934421 +0000 UTC m=+216.501516456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.503609 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.524028 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.544312 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.564248 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.580365 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-f4l29"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.584747 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.603575 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.603757 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.103731902 +0000 UTC m=+216.602313917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.603929 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.604094 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.604814 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.104771868 +0000 UTC m=+216.603353913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.624368 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.643722 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.664929 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.665716 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7"]
Feb 25 10:56:30 crc kubenswrapper[4725]: W0225 10:56:30.677176 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dc35e2f_3a10_41b4_ac03_753e62ff89a6.slice/crio-577aecf5aa63287f58a6e5cb45f40e056a0353a7c8ce99ce679fd42f870dfd07 WatchSource:0}: Error finding container 577aecf5aa63287f58a6e5cb45f40e056a0353a7c8ce99ce679fd42f870dfd07: Status 404 returned error can't find the container with id 577aecf5aa63287f58a6e5cb45f40e056a0353a7c8ce99ce679fd42f870dfd07
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.680199 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9"]
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.683788 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.704034 4725 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.704514 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.705016 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.204948752 +0000 UTC m=+216.703530787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: W0225 10:56:30.709339 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda248ae7c_6e03_4e10_bdd5_ef7e31335976.slice/crio-c4143681950d45e12717714ebeef574bec242fe837955c79265d1874d84827d4 WatchSource:0}: Error finding container c4143681950d45e12717714ebeef574bec242fe837955c79265d1874d84827d4: Status 404 returned error can't find the container with id c4143681950d45e12717714ebeef574bec242fe837955c79265d1874d84827d4
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.760430 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.765513 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-f4l29"]
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.781075 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfpnd\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-kube-api-access-xfpnd\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.781407 4725 request.go:700] Waited for 1.902153967s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/serviceaccounts/ingress-operator/token
Feb 25 10:56:30 crc kubenswrapper[4725]: W0225 10:56:30.787274 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf8d8d2_144e_4232_bd68_b14a9f178c7d.slice/crio-250227931cfe3391d1bc3d1691f6a51a264dd4e6b5fbc799f9b6d13d5c296409 WatchSource:0}: Error finding container 250227931cfe3391d1bc3d1691f6a51a264dd4e6b5fbc799f9b6d13d5c296409: Status 404 returned error can't find the container with id 250227931cfe3391d1bc3d1691f6a51a264dd4e6b5fbc799f9b6d13d5c296409
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.797926 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c330367-5495-4729-85ef-4ff602ab6808-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.806144 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.806439 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.30642921 +0000 UTC m=+216.805011235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.818105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wk9p\" (UniqueName: \"kubernetes.io/projected/b6ffed85-3d07-4bdb-80a0-60cde8b0b845-kube-api-access-9wk9p\") pod \"cluster-image-registry-operator-dc59b4c8b-8nwvl\" (UID: \"b6ffed85-3d07-4bdb-80a0-60cde8b0b845\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.830518 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.837143 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z564m\" (UniqueName: \"kubernetes.io/projected/f97b6a23-48f6-459d-bed6-ccaa1c917e8a-kube-api-access-z564m\") pod \"dns-operator-744455d44c-vtck6\" (UID: \"f97b6a23-48f6-459d-bed6-ccaa1c917e8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtck6"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.861159 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-bound-sa-token\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.878372 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5n5\" (UniqueName: \"kubernetes.io/projected/0c330367-5495-4729-85ef-4ff602ab6808-kube-api-access-6b5n5\") pod \"ingress-operator-5b745b69d9-f8bfv\" (UID: \"0c330367-5495-4729-85ef-4ff602ab6808\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.903754 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5sws\" (UniqueName: \"kubernetes.io/projected/9ca7925f-e394-489d-afee-bfe1c49c0ced-kube-api-access-t5sws\") pod \"openshift-apiserver-operator-796bbdcf4f-4fpbw\" (UID: \"9ca7925f-e394-489d-afee-bfe1c49c0ced\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.908588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:30 crc kubenswrapper[4725]: E0225 10:56:30.908988 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.408961415 +0000 UTC m=+216.907543450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.910602 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vtck6"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.919121 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwlrn\" (UniqueName: \"kubernetes.io/projected/b0766ac3-b78a-453e-a45e-ad88770d2513-kube-api-access-kwlrn\") pod \"openshift-controller-manager-operator-756b6f6bc6-884q6\" (UID: \"b0766ac3-b78a-453e-a45e-ad88770d2513\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.928521 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.937147 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ths9g\" (UniqueName: \"kubernetes.io/projected/fb51f87b-5859-44b4-ae55-c4f11ed0237b-kube-api-access-ths9g\") pod \"downloads-7954f5f757-9p4cm\" (UID: \"fb51f87b-5859-44b4-ae55-c4f11ed0237b\") " pod="openshift-console/downloads-7954f5f757-9p4cm"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.957809 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrd5l\" (UniqueName: \"kubernetes.io/projected/134ee45f-ab84-4033-bc7c-956e7a7721ae-kube-api-access-jrd5l\") pod \"etcd-operator-b45778765-p5mvj\" (UID: \"134ee45f-ab84-4033-bc7c-956e7a7721ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.991909 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 25 10:56:30 crc kubenswrapper[4725]: I0225 10:56:30.998485 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl"]
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.003562 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.010380 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.010629 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.510615937 +0000 UTC m=+217.009197962 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.025447 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 25 10:56:31 crc kubenswrapper[4725]: W0225 10:56:31.030671 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ffed85_3d07_4bdb_80a0_60cde8b0b845.slice/crio-c893d13628c5b304ab04608dad6e6d07709eaa2441c9163f8a6a75a6719efcfc WatchSource:0}: Error finding container c893d13628c5b304ab04608dad6e6d07709eaa2441c9163f8a6a75a6719efcfc: Status 404 returned error can't find the container with id c893d13628c5b304ab04608dad6e6d07709eaa2441c9163f8a6a75a6719efcfc
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.043776 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.084219 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.088648 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vtck6"]
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.097891 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9p4cm"
Feb 25 10:56:31 crc kubenswrapper[4725]: W0225 10:56:31.103675 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf97b6a23_48f6_459d_bed6_ccaa1c917e8a.slice/crio-d52e6d59dbf32df0a172cb207bc297fe378b0b136f6412f57bbacafe8ab45c63 WatchSource:0}: Error finding container d52e6d59dbf32df0a172cb207bc297fe378b0b136f6412f57bbacafe8ab45c63: Status 404 returned error can't find the container with id d52e6d59dbf32df0a172cb207bc297fe378b0b136f6412f57bbacafe8ab45c63
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.109100 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.111416 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.111557 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.61153541 +0000 UTC m=+217.110117435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.111638 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f1b7c78-4561-435c-95c8-61939c32c761-proxy-tls\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.111671 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-config\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.111694 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-client-ca\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.111718 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c17d10e-278a-4879-a7c6-debfdd094f48-node-bootstrap-token\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.111743 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6199f7d7-c530-47d4-8cb6-1526dcba2266-service-ca-bundle\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.111938 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6cf0f13c-4d57-434c-9a4c-d7621e13350c-auth-proxy-config\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112096 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfsxl\" (UniqueName: \"kubernetes.io/projected/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-kube-api-access-qfsxl\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112126 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-stats-auth\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112284 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112311 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112358 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-serving-cert\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112382 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb7r6\" (UniqueName: \"kubernetes.io/projected/26ea044e-327f-4510-ae22-a6e7d61a6873-kube-api-access-xb7r6\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112466 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-image-import-ca\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112488 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58ea6113-66d2-421d-b7cd-723463055f04-config\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e992a203-4363-40e2-a056-11aa5e5f11c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-56xfg\" (UID: \"e992a203-4363-40e2-a056-11aa5e5f11c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112610 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112635 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-registration-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112658 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112679 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112700 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqqdd\" (UniqueName: \"kubernetes.io/projected/28b55dc2-29a7-4828-8471-68dc3baffac6-kube-api-access-fqqdd\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112720 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") "
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112750 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-serving-cert\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112771 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112838 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63b30c59-fa34-4b6f-ac0b-7db7bf370389-proxy-tls\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112886 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112907 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88082131-2ef3-4ffc-890b-132cad0248cb-config\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112927 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63b30c59-fa34-4b6f-ac0b-7db7bf370389-images\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112947 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63b30c59-fa34-4b6f-ac0b-7db7bf370389-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112970 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6s2z\" (UniqueName: \"kubernetes.io/projected/e9ec5ada-4472-4a6e-862e-be351ff0542e-kube-api-access-d6s2z\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.112990 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113011 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5t95\" (UniqueName: \"kubernetes.io/projected/e2f68f82-94c6-45cb-acda-3d903d0f216e-kube-api-access-r5t95\") pod \"migrator-59844c95c7-mfshs\" (UID: \"e2f68f82-94c6-45cb-acda-3d903d0f216e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113032 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-encryption-config\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113053 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-plugins-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113076 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55t2b\" (UniqueName: \"kubernetes.io/projected/b15e4920-ccda-4486-84ea-f48a51517d73-kube-api-access-55t2b\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113113 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113154 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113214 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-metrics-certs\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113250 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b55dc2-29a7-4828-8471-68dc3baffac6-serving-cert\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113280 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b899s\" (UniqueName: 
\"kubernetes.io/projected/3a1d826c-e67e-4932-ab2c-41e53f848529-kube-api-access-b899s\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113302 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ec5ada-4472-4a6e-862e-be351ff0542e-trusted-ca\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113324 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/93efef4f-c6c1-47b8-ba83-12c56c3b08ea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gbzbf\" (UID: \"93efef4f-c6c1-47b8-ba83-12c56c3b08ea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113346 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nv8w\" (UniqueName: \"kubernetes.io/projected/395baad9-33c8-426a-8fe6-2fefe9c35fc2-kube-api-access-4nv8w\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113364 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24bebe29-933d-4461-8aab-b7d17e815781-audit-dir\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113410 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2fa3801-20c3-4d68-88fc-0376b23f7b5d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgzgw\" (UID: \"d2fa3801-20c3-4d68-88fc-0376b23f7b5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113433 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ll8\" (UniqueName: \"kubernetes.io/projected/80388d06-cb04-46d8-ae7a-fdaf4c66049f-kube-api-access-l2ll8\") pod \"package-server-manager-789f6589d5-nxmh5\" (UID: \"80388d06-cb04-46d8-ae7a-fdaf4c66049f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113458 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-676nn\" (UniqueName: \"kubernetes.io/projected/de7222c9-af96-4a59-9188-b53187f1cbe3-kube-api-access-676nn\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113479 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-signing-key\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113499 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/395baad9-33c8-426a-8fe6-2fefe9c35fc2-profile-collector-cert\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113525 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113558 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-audit\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113581 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113603 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c17d10e-278a-4879-a7c6-debfdd094f48-certs\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " 
pod="openshift-machine-config-operator/machine-config-server-5c7g8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113623 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl44l\" (UniqueName: \"kubernetes.io/projected/63b30c59-fa34-4b6f-ac0b-7db7bf370389-kube-api-access-pl44l\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113645 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-client-ca\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113668 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6cf0f13c-4d57-434c-9a4c-d7621e13350c-machine-approver-tls\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113688 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113732 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwkt9\" (UniqueName: \"kubernetes.io/projected/d2fa3801-20c3-4d68-88fc-0376b23f7b5d-kube-api-access-bwkt9\") pod \"cluster-samples-operator-665b6dd947-qgzgw\" (UID: \"d2fa3801-20c3-4d68-88fc-0376b23f7b5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113763 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113798 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-default-certificate\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113893 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmfn\" (UniqueName: \"kubernetes.io/projected/6cf0f13c-4d57-434c-9a4c-d7621e13350c-kube-api-access-9wmfn\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113962 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkhrk\" (UniqueName: 
\"kubernetes.io/projected/6199f7d7-c530-47d4-8cb6-1526dcba2266-kube-api-access-rkhrk\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.113995 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvkh\" (UniqueName: \"kubernetes.io/projected/93efef4f-c6c1-47b8-ba83-12c56c3b08ea-kube-api-access-ggvkh\") pod \"control-plane-machine-set-operator-78cbb6b69f-gbzbf\" (UID: \"93efef4f-c6c1-47b8-ba83-12c56c3b08ea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114024 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/598f09de-0be8-418a-a306-45517047d114-metrics-tls\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114056 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-socket-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114088 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/395baad9-33c8-426a-8fe6-2fefe9c35fc2-srv-cert\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114140 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7z8h\" (UniqueName: \"kubernetes.io/projected/08fe5978-cb79-459f-b51a-b8f769ea177f-kube-api-access-t7z8h\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114171 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f1b7c78-4561-435c-95c8-61939c32c761-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114220 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ec5ada-4472-4a6e-862e-be351ff0542e-config\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114250 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14a31736-63a7-443c-a99b-b03a2c285f37-srv-cert\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114314 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc37ca79-806f-43b5-a818-a642aa281d69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114347 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114379 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-audit-policies\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114409 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598f09de-0be8-418a-a306-45517047d114-config-volume\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:31 crc 
kubenswrapper[4725]: I0225 10:56:31.114457 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl79n\" (UniqueName: \"kubernetes.io/projected/14a31736-63a7-443c-a99b-b03a2c285f37-kube-api-access-gl79n\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114486 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nht5s\" (UniqueName: \"kubernetes.io/projected/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-kube-api-access-nht5s\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114518 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114588 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/58ea6113-66d2-421d-b7cd-723463055f04-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114618 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-config\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114651 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z86fm\" (UniqueName: \"kubernetes.io/projected/c35a3cc3-c02a-43bc-aba2-22117865c274-kube-api-access-z86fm\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114680 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08fe5978-cb79-459f-b51a-b8f769ea177f-config-volume\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114711 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmw2\" (UniqueName: \"kubernetes.io/projected/58ea6113-66d2-421d-b7cd-723463055f04-kube-api-access-4hmw2\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114745 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwl5p\" (UniqueName: \"kubernetes.io/projected/e992a203-4363-40e2-a056-11aa5e5f11c3-kube-api-access-zwl5p\") pod \"multus-admission-controller-857f4d67dd-56xfg\" (UID: \"e992a203-4363-40e2-a056-11aa5e5f11c3\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114793 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114878 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqtj7\" (UniqueName: \"kubernetes.io/projected/7c17d10e-278a-4879-a7c6-debfdd094f48-kube-api-access-gqtj7\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114916 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a1d826c-e67e-4932-ab2c-41e53f848529-apiservice-cert\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114936 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ea044e-327f-4510-ae22-a6e7d61a6873-serving-cert\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114960 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-config\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.114993 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cf0f13c-4d57-434c-9a4c-d7621e13350c-config\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115014 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115036 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc37ca79-806f-43b5-a818-a642aa281d69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115057 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-service-ca-bundle\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115116 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c35a3cc3-c02a-43bc-aba2-22117865c274-audit-dir\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115137 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88082131-2ef3-4ffc-890b-132cad0248cb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115160 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-mountpoint-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115182 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-etcd-client\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115202 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2gvm\" (UniqueName: 
\"kubernetes.io/projected/24bebe29-933d-4461-8aab-b7d17e815781-kube-api-access-r2gvm\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115264 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/80388d06-cb04-46d8-ae7a-fdaf4c66049f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nxmh5\" (UID: \"80388d06-cb04-46d8-ae7a-fdaf4c66049f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115286 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08fe5978-cb79-459f-b51a-b8f769ea177f-secret-volume\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115310 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c35a3cc3-c02a-43bc-aba2-22117865c274-node-pullsecrets\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115332 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a1d826c-e67e-4932-ab2c-41e53f848529-webhook-cert\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115353 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115378 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x9nl\" (UniqueName: \"kubernetes.io/projected/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-kube-api-access-7x9nl\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115402 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-config\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115425 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc37ca79-806f-43b5-a818-a642aa281d69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115457 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14a31736-63a7-443c-a99b-b03a2c285f37-profile-collector-cert\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115482 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-csi-data-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115505 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-signing-cabundle\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115527 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/58ea6113-66d2-421d-b7cd-723463055f04-images\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115548 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ec5ada-4472-4a6e-862e-be351ff0542e-serving-cert\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " 
pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115561 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115571 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88082131-2ef3-4ffc-890b-132cad0248cb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.115675 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9h2\" (UniqueName: \"kubernetes.io/projected/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-kube-api-access-2l9h2\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.116553 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.616534318 +0000 UTC m=+217.115116343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117023 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-config\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117058 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ll8c\" (UniqueName: \"kubernetes.io/projected/b0b17a01-64f4-4578-9e56-19825cfa713f-kube-api-access-9ll8c\") pod \"auto-csr-approver-29533616-zsh9g\" (UID: \"b0b17a01-64f4-4578-9e56-19825cfa713f\") " pod="openshift-infra/auto-csr-approver-29533616-zsh9g" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117102 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15e4920-ccda-4486-84ea-f48a51517d73-serving-cert\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117121 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx5qs\" (UniqueName: 
\"kubernetes.io/projected/598f09de-0be8-418a-a306-45517047d114-kube-api-access-qx5qs\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117307 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmx5\" (UniqueName: \"kubernetes.io/projected/4f1b7c78-4561-435c-95c8-61939c32c761-kube-api-access-xpmx5\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117339 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-etcd-serving-ca\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117376 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a1d826c-e67e-4932-ab2c-41e53f848529-tmpfs\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.117415 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-config\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: W0225 10:56:31.137670 
4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c330367_5495_4729_85ef_4ff602ab6808.slice/crio-e8dd89804e2b9d05b3a7450058f7865a7d758218bb0990acb0e361701e2235e4 WatchSource:0}: Error finding container e8dd89804e2b9d05b3a7450058f7865a7d758218bb0990acb0e361701e2235e4: Status 404 returned error can't find the container with id e8dd89804e2b9d05b3a7450058f7865a7d758218bb0990acb0e361701e2235e4 Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.165791 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218110 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.218263 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.718231951 +0000 UTC m=+217.216813986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218621 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb7r6\" (UniqueName: \"kubernetes.io/projected/26ea044e-327f-4510-ae22-a6e7d61a6873-kube-api-access-xb7r6\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218656 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-image-import-ca\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218671 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58ea6113-66d2-421d-b7cd-723463055f04-config\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218687 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e992a203-4363-40e2-a056-11aa5e5f11c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-56xfg\" (UID: 
\"e992a203-4363-40e2-a056-11aa5e5f11c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218705 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218724 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-registration-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218739 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218756 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218773 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fqqdd\" (UniqueName: \"kubernetes.io/projected/28b55dc2-29a7-4828-8471-68dc3baffac6-kube-api-access-fqqdd\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.218973 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219033 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-serving-cert\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219051 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219075 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63b30c59-fa34-4b6f-ac0b-7db7bf370389-proxy-tls\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219106 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219123 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88082131-2ef3-4ffc-890b-132cad0248cb-config\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219137 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63b30c59-fa34-4b6f-ac0b-7db7bf370389-images\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219151 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63b30c59-fa34-4b6f-ac0b-7db7bf370389-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219167 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6s2z\" (UniqueName: 
\"kubernetes.io/projected/e9ec5ada-4472-4a6e-862e-be351ff0542e-kube-api-access-d6s2z\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219181 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219202 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5t95\" (UniqueName: \"kubernetes.io/projected/e2f68f82-94c6-45cb-acda-3d903d0f216e-kube-api-access-r5t95\") pod \"migrator-59844c95c7-mfshs\" (UID: \"e2f68f82-94c6-45cb-acda-3d903d0f216e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219219 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-encryption-config\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219234 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-plugins-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219252 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-55t2b\" (UniqueName: \"kubernetes.io/projected/b15e4920-ccda-4486-84ea-f48a51517d73-kube-api-access-55t2b\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219275 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219295 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219311 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-metrics-certs\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219328 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b55dc2-29a7-4828-8471-68dc3baffac6-serving-cert\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: 
\"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b899s\" (UniqueName: \"kubernetes.io/projected/3a1d826c-e67e-4932-ab2c-41e53f848529-kube-api-access-b899s\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219363 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ec5ada-4472-4a6e-862e-be351ff0542e-trusted-ca\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219379 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/93efef4f-c6c1-47b8-ba83-12c56c3b08ea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gbzbf\" (UID: \"93efef4f-c6c1-47b8-ba83-12c56c3b08ea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219396 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nv8w\" (UniqueName: \"kubernetes.io/projected/395baad9-33c8-426a-8fe6-2fefe9c35fc2-kube-api-access-4nv8w\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219415 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24bebe29-933d-4461-8aab-b7d17e815781-audit-dir\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219440 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2fa3801-20c3-4d68-88fc-0376b23f7b5d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgzgw\" (UID: \"d2fa3801-20c3-4d68-88fc-0376b23f7b5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219456 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ll8\" (UniqueName: \"kubernetes.io/projected/80388d06-cb04-46d8-ae7a-fdaf4c66049f-kube-api-access-l2ll8\") pod \"package-server-manager-789f6589d5-nxmh5\" (UID: \"80388d06-cb04-46d8-ae7a-fdaf4c66049f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219476 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-676nn\" (UniqueName: \"kubernetes.io/projected/de7222c9-af96-4a59-9188-b53187f1cbe3-kube-api-access-676nn\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-signing-key\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219511 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/395baad9-33c8-426a-8fe6-2fefe9c35fc2-profile-collector-cert\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219529 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219528 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-registration-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219550 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-audit\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219591 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219621 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c17d10e-278a-4879-a7c6-debfdd094f48-certs\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219646 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24bceb1c-b610-49b3-9c13-410339a6755d-cert\") pod \"ingress-canary-b6x7k\" (UID: \"24bceb1c-b610-49b3-9c13-410339a6755d\") " pod="openshift-ingress-canary/ingress-canary-b6x7k"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219671 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl44l\" (UniqueName: \"kubernetes.io/projected/63b30c59-fa34-4b6f-ac0b-7db7bf370389-kube-api-access-pl44l\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219695 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-client-ca\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219718 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6cf0f13c-4d57-434c-9a4c-d7621e13350c-machine-approver-tls\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219739 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwkt9\" (UniqueName: \"kubernetes.io/projected/d2fa3801-20c3-4d68-88fc-0376b23f7b5d-kube-api-access-bwkt9\") pod \"cluster-samples-operator-665b6dd947-qgzgw\" (UID: \"d2fa3801-20c3-4d68-88fc-0376b23f7b5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219811 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219866 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-default-certificate\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219891 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmfn\" (UniqueName: \"kubernetes.io/projected/6cf0f13c-4d57-434c-9a4c-d7621e13350c-kube-api-access-9wmfn\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219916 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkhrk\" (UniqueName: \"kubernetes.io/projected/6199f7d7-c530-47d4-8cb6-1526dcba2266-kube-api-access-rkhrk\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219941 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvkh\" (UniqueName: \"kubernetes.io/projected/93efef4f-c6c1-47b8-ba83-12c56c3b08ea-kube-api-access-ggvkh\") pod \"control-plane-machine-set-operator-78cbb6b69f-gbzbf\" (UID: \"93efef4f-c6c1-47b8-ba83-12c56c3b08ea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.219965 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/598f09de-0be8-418a-a306-45517047d114-metrics-tls\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220014 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-socket-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220037 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/395baad9-33c8-426a-8fe6-2fefe9c35fc2-srv-cert\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220066 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7z8h\" (UniqueName: \"kubernetes.io/projected/08fe5978-cb79-459f-b51a-b8f769ea177f-kube-api-access-t7z8h\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220091 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f1b7c78-4561-435c-95c8-61939c32c761-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220115 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ec5ada-4472-4a6e-862e-be351ff0542e-config\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220118 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58ea6113-66d2-421d-b7cd-723463055f04-config\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14a31736-63a7-443c-a99b-b03a2c285f37-srv-cert\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220162 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220185 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc37ca79-806f-43b5-a818-a642aa281d69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220214 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220240 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-audit-policies\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220264 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598f09de-0be8-418a-a306-45517047d114-config-volume\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220286 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl79n\" (UniqueName: \"kubernetes.io/projected/14a31736-63a7-443c-a99b-b03a2c285f37-kube-api-access-gl79n\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220308 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nht5s\" (UniqueName: \"kubernetes.io/projected/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-kube-api-access-nht5s\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220333 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/58ea6113-66d2-421d-b7cd-723463055f04-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220379 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-config\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220405 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z86fm\" (UniqueName: \"kubernetes.io/projected/c35a3cc3-c02a-43bc-aba2-22117865c274-kube-api-access-z86fm\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220428 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08fe5978-cb79-459f-b51a-b8f769ea177f-config-volume\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220449 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmw2\" (UniqueName: \"kubernetes.io/projected/58ea6113-66d2-421d-b7cd-723463055f04-kube-api-access-4hmw2\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220471 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwl5p\" (UniqueName: \"kubernetes.io/projected/e992a203-4363-40e2-a056-11aa5e5f11c3-kube-api-access-zwl5p\") pod \"multus-admission-controller-857f4d67dd-56xfg\" (UID: \"e992a203-4363-40e2-a056-11aa5e5f11c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220516 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqtj7\" (UniqueName: \"kubernetes.io/projected/7c17d10e-278a-4879-a7c6-debfdd094f48-kube-api-access-gqtj7\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220542 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a1d826c-e67e-4932-ab2c-41e53f848529-apiservice-cert\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220564 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ea044e-327f-4510-ae22-a6e7d61a6873-serving-cert\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220589 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-config\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220612 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cf0f13c-4d57-434c-9a4c-d7621e13350c-config\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220634 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220655 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc37ca79-806f-43b5-a818-a642aa281d69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220676 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-service-ca-bundle\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220712 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c35a3cc3-c02a-43bc-aba2-22117865c274-audit-dir\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220739 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88082131-2ef3-4ffc-890b-132cad0248cb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220741 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63b30c59-fa34-4b6f-ac0b-7db7bf370389-images\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220764 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-mountpoint-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220788 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-etcd-client\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220811 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2gvm\" (UniqueName: \"kubernetes.io/projected/24bebe29-933d-4461-8aab-b7d17e815781-kube-api-access-r2gvm\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220840 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63b30c59-fa34-4b6f-ac0b-7db7bf370389-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220861 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/80388d06-cb04-46d8-ae7a-fdaf4c66049f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nxmh5\" (UID: \"80388d06-cb04-46d8-ae7a-fdaf4c66049f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220901 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08fe5978-cb79-459f-b51a-b8f769ea177f-secret-volume\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220924 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c35a3cc3-c02a-43bc-aba2-22117865c274-node-pullsecrets\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220945 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a1d826c-e67e-4932-ab2c-41e53f848529-webhook-cert\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220970 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.220998 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x9nl\" (UniqueName: \"kubernetes.io/projected/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-kube-api-access-7x9nl\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221024 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-config\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221050 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc37ca79-806f-43b5-a818-a642aa281d69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221079 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx4xj\" (UniqueName: \"kubernetes.io/projected/24bceb1c-b610-49b3-9c13-410339a6755d-kube-api-access-wx4xj\") pod \"ingress-canary-b6x7k\" (UID: \"24bceb1c-b610-49b3-9c13-410339a6755d\") " pod="openshift-ingress-canary/ingress-canary-b6x7k"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221111 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221122 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14a31736-63a7-443c-a99b-b03a2c285f37-profile-collector-cert\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221164 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-csi-data-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221196 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-signing-cabundle\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221223 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/58ea6113-66d2-421d-b7cd-723463055f04-images\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221223 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221246 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ec5ada-4472-4a6e-862e-be351ff0542e-serving-cert\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221224 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ec5ada-4472-4a6e-862e-be351ff0542e-trusted-ca\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221272 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88082131-2ef3-4ffc-890b-132cad0248cb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221299 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9h2\" (UniqueName: \"kubernetes.io/projected/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-kube-api-access-2l9h2\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221325 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-config\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221349 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ll8c\" (UniqueName: \"kubernetes.io/projected/b0b17a01-64f4-4578-9e56-19825cfa713f-kube-api-access-9ll8c\") pod \"auto-csr-approver-29533616-zsh9g\" (UID: \"b0b17a01-64f4-4578-9e56-19825cfa713f\") " pod="openshift-infra/auto-csr-approver-29533616-zsh9g"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221378 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15e4920-ccda-4486-84ea-f48a51517d73-serving-cert\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221401 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx5qs\" (UniqueName: \"kubernetes.io/projected/598f09de-0be8-418a-a306-45517047d114-kube-api-access-qx5qs\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221425 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmx5\" (UniqueName: \"kubernetes.io/projected/4f1b7c78-4561-435c-95c8-61939c32c761-kube-api-access-xpmx5\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221448 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-etcd-serving-ca\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221468 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a1d826c-e67e-4932-ab2c-41e53f848529-tmpfs\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-config\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221519 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f1b7c78-4561-435c-95c8-61939c32c761-proxy-tls\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221542 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-config\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221563 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-client-ca\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221585 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c17d10e-278a-4879-a7c6-debfdd094f48-node-bootstrap-token\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221608 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6199f7d7-c530-47d4-8cb6-1526dcba2266-service-ca-bundle\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221632 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6cf0f13c-4d57-434c-9a4c-d7621e13350c-auth-proxy-config\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf"
Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221657 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfsxl\" (UniqueName: \"kubernetes.io/projected/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-kube-api-access-qfsxl\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"
Feb 25
10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221679 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-stats-auth\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221695 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24bebe29-933d-4461-8aab-b7d17e815781-audit-dir\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221701 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221751 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221777 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-serving-cert\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.221964 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.222264 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.222356 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-plugins-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.222474 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.223435 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/88082131-2ef3-4ffc-890b-132cad0248cb-config\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.224996 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-client-ca\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.225018 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-config\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.225239 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-image-import-ca\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.225449 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-audit\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.225544 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.226020 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-csi-data-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.226072 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.726055172 +0000 UTC m=+217.224637297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.226903 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.227212 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/93efef4f-c6c1-47b8-ba83-12c56c3b08ea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gbzbf\" (UID: \"93efef4f-c6c1-47b8-ba83-12c56c3b08ea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.227351 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-serving-cert\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.228122 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-serving-cert\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.228566 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.228609 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cf0f13c-4d57-434c-9a4c-d7621e13350c-config\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:31 
crc kubenswrapper[4725]: I0225 10:56:31.231026 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c35a3cc3-c02a-43bc-aba2-22117865c274-audit-dir\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.231198 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-mountpoint-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.231741 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f1b7c78-4561-435c-95c8-61939c32c761-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.231962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/58ea6113-66d2-421d-b7cd-723463055f04-images\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.232092 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ec5ada-4472-4a6e-862e-be351ff0542e-config\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc 
kubenswrapper[4725]: I0225 10:56:31.232220 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63b30c59-fa34-4b6f-ac0b-7db7bf370389-proxy-tls\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.232351 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-config\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.232802 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-client-ca\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.232980 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28b55dc2-29a7-4828-8471-68dc3baffac6-service-ca-bundle\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.233082 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-config\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: 
\"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.233136 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-signing-cabundle\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.233256 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc37ca79-806f-43b5-a818-a642aa281d69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.233302 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de7222c9-af96-4a59-9188-b53187f1cbe3-socket-dir\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.233563 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08fe5978-cb79-459f-b51a-b8f769ea177f-config-volume\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.234087 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-signing-key\") pod 
\"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.234110 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-audit-policies\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.234317 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.234398 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-encryption-config\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.234850 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a1d826c-e67e-4932-ab2c-41e53f848529-tmpfs\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.234927 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/6cf0f13c-4d57-434c-9a4c-d7621e13350c-machine-approver-tls\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.234929 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-config\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.235171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.235557 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6cf0f13c-4d57-434c-9a4c-d7621e13350c-auth-proxy-config\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.235569 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-etcd-serving-ca\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.235901 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6199f7d7-c530-47d4-8cb6-1526dcba2266-service-ca-bundle\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.235945 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598f09de-0be8-418a-a306-45517047d114-config-volume\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.236185 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e992a203-4363-40e2-a056-11aa5e5f11c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-56xfg\" (UID: \"e992a203-4363-40e2-a056-11aa5e5f11c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.236355 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c35a3cc3-c02a-43bc-aba2-22117865c274-node-pullsecrets\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.236496 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.236756 
4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2fa3801-20c3-4d68-88fc-0376b23f7b5d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgzgw\" (UID: \"d2fa3801-20c3-4d68-88fc-0376b23f7b5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.236858 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35a3cc3-c02a-43bc-aba2-22117865c274-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.237262 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-config\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.237591 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.237839 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-config\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.237845 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b55dc2-29a7-4828-8471-68dc3baffac6-serving-cert\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.238720 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15e4920-ccda-4486-84ea-f48a51517d73-serving-cert\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.239040 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.239233 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/598f09de-0be8-418a-a306-45517047d114-metrics-tls\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.239599 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14a31736-63a7-443c-a99b-b03a2c285f37-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.240585 4725 generic.go:334] "Generic (PLEG): container finished" podID="a248ae7c-6e03-4e10-bdd5-ef7e31335976" containerID="781467ed934f8799ec24fbb3641f19c7d8a60f34f4c64773739f4ffaf0d38bf4" exitCode=0 Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.241041 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ea044e-327f-4510-ae22-a6e7d61a6873-serving-cert\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.241789 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/395baad9-33c8-426a-8fe6-2fefe9c35fc2-profile-collector-cert\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.244180 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c17d10e-278a-4879-a7c6-debfdd094f48-node-bootstrap-token\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.246102 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f1b7c78-4561-435c-95c8-61939c32c761-proxy-tls\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.246369 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.246940 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c17d10e-278a-4879-a7c6-debfdd094f48-certs\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.247250 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08fe5978-cb79-459f-b51a-b8f769ea177f-secret-volume\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.247466 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.247474 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a1d826c-e67e-4932-ab2c-41e53f848529-webhook-cert\") pod 
\"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.247571 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.247741 4725 generic.go:334] "Generic (PLEG): container finished" podID="7dc35e2f-3a10-41b4-ac03-753e62ff89a6" containerID="f0326814a3f77b632cadfdeeea5e742a08102a7f81d25e4410c6d392869176b1" exitCode=0 Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.247915 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c35a3cc3-c02a-43bc-aba2-22117865c274-etcd-client\") pod \"apiserver-76f77b778f-qsb7p\" (UID: \"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.248318 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-metrics-certs\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.249378 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ec5ada-4472-4a6e-862e-be351ff0542e-serving-cert\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " 
pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.250976 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/80388d06-cb04-46d8-ae7a-fdaf4c66049f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nxmh5\" (UID: \"80388d06-cb04-46d8-ae7a-fdaf4c66049f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.251306 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14a31736-63a7-443c-a99b-b03a2c285f37-srv-cert\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.253036 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/58ea6113-66d2-421d-b7cd-723463055f04-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: \"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.253093 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-stats-auth\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.253239 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88082131-2ef3-4ffc-890b-132cad0248cb-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.253464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc37ca79-806f-43b5-a818-a642aa281d69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.253729 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6199f7d7-c530-47d4-8cb6-1526dcba2266-default-certificate\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.254053 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.254488 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.254523 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/395baad9-33c8-426a-8fe6-2fefe9c35fc2-srv-cert\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.255667 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a1d826c-e67e-4932-ab2c-41e53f848529-apiservice-cert\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.262052 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqqdd\" (UniqueName: \"kubernetes.io/projected/28b55dc2-29a7-4828-8471-68dc3baffac6-kube-api-access-fqqdd\") pod \"authentication-operator-69f744f599-vkmbp\" (UID: \"28b55dc2-29a7-4828-8471-68dc3baffac6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.279127 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb7r6\" (UniqueName: \"kubernetes.io/projected/26ea044e-327f-4510-ae22-a6e7d61a6873-kube-api-access-xb7r6\") pod \"controller-manager-879f6c89f-nrlgl\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289297 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" event={"ID":"f97b6a23-48f6-459d-bed6-ccaa1c917e8a","Type":"ContainerStarted","Data":"d52e6d59dbf32df0a172cb207bc297fe378b0b136f6412f57bbacafe8ab45c63"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289342 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" event={"ID":"a248ae7c-6e03-4e10-bdd5-ef7e31335976","Type":"ContainerDied","Data":"781467ed934f8799ec24fbb3641f19c7d8a60f34f4c64773739f4ffaf0d38bf4"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289361 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" event={"ID":"a248ae7c-6e03-4e10-bdd5-ef7e31335976","Type":"ContainerStarted","Data":"c4143681950d45e12717714ebeef574bec242fe837955c79265d1874d84827d4"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289374 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" event={"ID":"0c330367-5495-4729-85ef-4ff602ab6808","Type":"ContainerStarted","Data":"e8dd89804e2b9d05b3a7450058f7865a7d758218bb0990acb0e361701e2235e4"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289386 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" event={"ID":"7dc35e2f-3a10-41b4-ac03-753e62ff89a6","Type":"ContainerDied","Data":"f0326814a3f77b632cadfdeeea5e742a08102a7f81d25e4410c6d392869176b1"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289399 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" event={"ID":"7dc35e2f-3a10-41b4-ac03-753e62ff89a6","Type":"ContainerStarted","Data":"577aecf5aa63287f58a6e5cb45f40e056a0353a7c8ce99ce679fd42f870dfd07"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289412 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" event={"ID":"b6ffed85-3d07-4bdb-80a0-60cde8b0b845","Type":"ContainerStarted","Data":"fa9f7b0dc57b4f4e4270e50dd6fd7fc6f22f26f9406ca8d980864205b03a01f4"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 
10:56:31.289431 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" event={"ID":"b6ffed85-3d07-4bdb-80a0-60cde8b0b845","Type":"ContainerStarted","Data":"c893d13628c5b304ab04608dad6e6d07709eaa2441c9163f8a6a75a6719efcfc"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289439 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f4l29" event={"ID":"dcf8d8d2-144e-4232-bd68-b14a9f178c7d","Type":"ContainerStarted","Data":"23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.289448 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f4l29" event={"ID":"dcf8d8d2-144e-4232-bd68-b14a9f178c7d","Type":"ContainerStarted","Data":"250227931cfe3391d1bc3d1691f6a51a264dd4e6b5fbc799f9b6d13d5c296409"} Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.301035 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ll8\" (UniqueName: \"kubernetes.io/projected/80388d06-cb04-46d8-ae7a-fdaf4c66049f-kube-api-access-l2ll8\") pod \"package-server-manager-789f6589d5-nxmh5\" (UID: \"80388d06-cb04-46d8-ae7a-fdaf4c66049f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.320681 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6s2z\" (UniqueName: \"kubernetes.io/projected/e9ec5ada-4472-4a6e-862e-be351ff0542e-kube-api-access-d6s2z\") pod \"console-operator-58897d9998-5p82k\" (UID: \"e9ec5ada-4472-4a6e-862e-be351ff0542e\") " pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.322598 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.322888 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24bceb1c-b610-49b3-9c13-410339a6755d-cert\") pod \"ingress-canary-b6x7k\" (UID: \"24bceb1c-b610-49b3-9c13-410339a6755d\") " pod="openshift-ingress-canary/ingress-canary-b6x7k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.323202 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx4xj\" (UniqueName: \"kubernetes.io/projected/24bceb1c-b610-49b3-9c13-410339a6755d-kube-api-access-wx4xj\") pod \"ingress-canary-b6x7k\" (UID: \"24bceb1c-b610-49b3-9c13-410339a6755d\") " pod="openshift-ingress-canary/ingress-canary-b6x7k" Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.324124 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.824098622 +0000 UTC m=+217.322680657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.329557 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24bceb1c-b610-49b3-9c13-410339a6755d-cert\") pod \"ingress-canary-b6x7k\" (UID: \"24bceb1c-b610-49b3-9c13-410339a6755d\") " pod="openshift-ingress-canary/ingress-canary-b6x7k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.339706 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5t95\" (UniqueName: \"kubernetes.io/projected/e2f68f82-94c6-45cb-acda-3d903d0f216e-kube-api-access-r5t95\") pod \"migrator-59844c95c7-mfshs\" (UID: \"e2f68f82-94c6-45cb-acda-3d903d0f216e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.356554 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.360922 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nv8w\" (UniqueName: \"kubernetes.io/projected/395baad9-33c8-426a-8fe6-2fefe9c35fc2-kube-api-access-4nv8w\") pod \"catalog-operator-68c6474976-tnjxg\" (UID: \"395baad9-33c8-426a-8fe6-2fefe9c35fc2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.388635 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-676nn\" (UniqueName: \"kubernetes.io/projected/de7222c9-af96-4a59-9188-b53187f1cbe3-kube-api-access-676nn\") pod \"csi-hostpathplugin-ql8k8\" (UID: \"de7222c9-af96-4a59-9188-b53187f1cbe3\") " pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.394271 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.401277 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.404285 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4fa402ac-57ab-4b91-b7a7-2b3d6dae192f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rxn4n\" (UID: \"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.408881 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.417571 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.426083 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.426465 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:31.926454262 +0000 UTC m=+217.425036277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.431459 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55t2b\" (UniqueName: \"kubernetes.io/projected/b15e4920-ccda-4486-84ea-f48a51517d73-kube-api-access-55t2b\") pod \"route-controller-manager-6576b87f9c-77cqr\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.436296 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc37ca79-806f-43b5-a818-a642aa281d69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hnqfj\" (UID: \"dc37ca79-806f-43b5-a818-a642aa281d69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.445792 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9p4cm"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.474286 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl44l\" (UniqueName: \"kubernetes.io/projected/63b30c59-fa34-4b6f-ac0b-7db7bf370389-kube-api-access-pl44l\") pod \"machine-config-operator-74547568cd-wc2rt\" (UID: \"63b30c59-fa34-4b6f-ac0b-7db7bf370389\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc 
kubenswrapper[4725]: I0225 10:56:31.480715 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.485737 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwkt9\" (UniqueName: \"kubernetes.io/projected/d2fa3801-20c3-4d68-88fc-0376b23f7b5d-kube-api-access-bwkt9\") pod \"cluster-samples-operator-665b6dd947-qgzgw\" (UID: \"d2fa3801-20c3-4d68-88fc-0376b23f7b5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.507706 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.516148 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.519529 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7z8h\" (UniqueName: \"kubernetes.io/projected/08fe5978-cb79-459f-b51a-b8f769ea177f-kube-api-access-t7z8h\") pod \"collect-profiles-29533605-22g2l\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.523745 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5mvj"] Feb 25 10:56:31 crc kubenswrapper[4725]: W0225 10:56:31.526354 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca7925f_e394_489d_afee_bfe1c49c0ced.slice/crio-e3ee18067de62f83f53137827246d5b5199bc2d803d16cd883430f881933e1d0 WatchSource:0}: 
Error finding container e3ee18067de62f83f53137827246d5b5199bc2d803d16cd883430f881933e1d0: Status 404 returned error can't find the container with id e3ee18067de62f83f53137827246d5b5199bc2d803d16cd883430f881933e1d0 Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.526790 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.527038 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.027012756 +0000 UTC m=+217.525594781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.540910 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx5qs\" (UniqueName: \"kubernetes.io/projected/598f09de-0be8-418a-a306-45517047d114-kube-api-access-qx5qs\") pod \"dns-default-zszmh\" (UID: \"598f09de-0be8-418a-a306-45517047d114\") " pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.541658 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.562285 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.564348 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.565453 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvkh\" (UniqueName: \"kubernetes.io/projected/93efef4f-c6c1-47b8-ba83-12c56c3b08ea-kube-api-access-ggvkh\") pod \"control-plane-machine-set-operator-78cbb6b69f-gbzbf\" (UID: \"93efef4f-c6c1-47b8-ba83-12c56c3b08ea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.591543 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkhrk\" (UniqueName: \"kubernetes.io/projected/6199f7d7-c530-47d4-8cb6-1526dcba2266-kube-api-access-rkhrk\") pod \"router-default-5444994796-7lb6x\" (UID: \"6199f7d7-c530-47d4-8cb6-1526dcba2266\") " pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.601747 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmx5\" (UniqueName: \"kubernetes.io/projected/4f1b7c78-4561-435c-95c8-61939c32c761-kube-api-access-xpmx5\") pod \"machine-config-controller-84d6567774-bx66x\" (UID: \"4f1b7c78-4561-435c-95c8-61939c32c761\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.623732 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5p82k"] Feb 25 10:56:31 crc 
kubenswrapper[4725]: I0225 10:56:31.623954 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.628004 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl79n\" (UniqueName: \"kubernetes.io/projected/14a31736-63a7-443c-a99b-b03a2c285f37-kube-api-access-gl79n\") pod \"olm-operator-6b444d44fb-njgfq\" (UID: \"14a31736-63a7-443c-a99b-b03a2c285f37\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.628769 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.629180 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.12916166 +0000 UTC m=+217.627743685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.631897 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.641021 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nht5s\" (UniqueName: \"kubernetes.io/projected/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-kube-api-access-nht5s\") pod \"marketplace-operator-79b997595-m7624\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.649318 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.657956 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88082131-2ef3-4ffc-890b-132cad0248cb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6kmdm\" (UID: \"88082131-2ef3-4ffc-890b-132cad0248cb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.665695 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.672590 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nrlgl"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.701080 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ll8c\" (UniqueName: \"kubernetes.io/projected/b0b17a01-64f4-4578-9e56-19825cfa713f-kube-api-access-9ll8c\") pod \"auto-csr-approver-29533616-zsh9g\" (UID: \"b0b17a01-64f4-4578-9e56-19825cfa713f\") " pod="openshift-infra/auto-csr-approver-29533616-zsh9g" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.702353 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmfn\" (UniqueName: \"kubernetes.io/projected/6cf0f13c-4d57-434c-9a4c-d7621e13350c-kube-api-access-9wmfn\") pod \"machine-approver-56656f9798-shjdf\" (UID: \"6cf0f13c-4d57-434c-9a4c-d7621e13350c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.710286 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.722532 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b899s\" (UniqueName: \"kubernetes.io/projected/3a1d826c-e67e-4932-ab2c-41e53f848529-kube-api-access-b899s\") pod \"packageserver-d55dfcdfc-zxhvz\" (UID: \"3a1d826c-e67e-4932-ab2c-41e53f848529\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.725501 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.730025 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.730222 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.230191796 +0000 UTC m=+217.728773831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.730382 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.730745 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.23073498 +0000 UTC m=+217.729317085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.740107 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533616-zsh9g" Feb 25 10:56:31 crc kubenswrapper[4725]: W0225 10:56:31.743529 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ea044e_327f_4510_ae22_a6e7d61a6873.slice/crio-07fb12677a4eb51ca5f6dc3a21766d9babc58f68a8d20c668a2e5da09ae80765 WatchSource:0}: Error finding container 07fb12677a4eb51ca5f6dc3a21766d9babc58f68a8d20c668a2e5da09ae80765: Status 404 returned error can't find the container with id 07fb12677a4eb51ca5f6dc3a21766d9babc58f68a8d20c668a2e5da09ae80765 Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.745190 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.754544 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.755551 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vkmbp"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.758813 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2gvm\" (UniqueName: \"kubernetes.io/projected/24bebe29-933d-4461-8aab-b7d17e815781-kube-api-access-r2gvm\") pod \"oauth-openshift-558db77b4-6trwd\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.761950 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z86fm\" (UniqueName: \"kubernetes.io/projected/c35a3cc3-c02a-43bc-aba2-22117865c274-kube-api-access-z86fm\") pod \"apiserver-76f77b778f-qsb7p\" (UID: 
\"c35a3cc3-c02a-43bc-aba2-22117865c274\") " pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.762090 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.780453 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9h2\" (UniqueName: \"kubernetes.io/projected/dc0834fc-dc53-4913-93a1-a76b1ebf7d0c-kube-api-access-2l9h2\") pod \"kube-storage-version-migrator-operator-b67b599dd-57bqz\" (UID: \"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.794757 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.800635 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfsxl\" (UniqueName: \"kubernetes.io/projected/c7f92e9e-983b-42cb-9eb3-28c8f5a0c848-kube-api-access-qfsxl\") pod \"service-ca-operator-777779d784-q77vl\" (UID: \"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.811819 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5"] Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.825488 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmw2\" (UniqueName: \"kubernetes.io/projected/58ea6113-66d2-421d-b7cd-723463055f04-kube-api-access-4hmw2\") pod \"machine-api-operator-5694c8668f-mw7b2\" (UID: 
\"58ea6113-66d2-421d-b7cd-723463055f04\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.832543 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.833958 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.333939812 +0000 UTC m=+217.832521837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.846705 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwl5p\" (UniqueName: \"kubernetes.io/projected/e992a203-4363-40e2-a056-11aa5e5f11c3-kube-api-access-zwl5p\") pod \"multus-admission-controller-857f4d67dd-56xfg\" (UID: \"e992a203-4363-40e2-a056-11aa5e5f11c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.847135 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.859248 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqtj7\" (UniqueName: \"kubernetes.io/projected/7c17d10e-278a-4879-a7c6-debfdd094f48-kube-api-access-gqtj7\") pod \"machine-config-server-5c7g8\" (UID: \"7c17d10e-278a-4879-a7c6-debfdd094f48\") " pod="openshift-machine-config-operator/machine-config-server-5c7g8" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.879778 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.880399 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.880987 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x9nl\" (UniqueName: \"kubernetes.io/projected/42ef2d6e-c9d8-44ed-b5f9-b0853923968f-kube-api-access-7x9nl\") pod \"service-ca-9c57cc56f-brdsl\" (UID: \"42ef2d6e-c9d8-44ed-b5f9-b0853923968f\") " pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.898136 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx4xj\" (UniqueName: \"kubernetes.io/projected/24bceb1c-b610-49b3-9c13-410339a6755d-kube-api-access-wx4xj\") pod \"ingress-canary-b6x7k\" (UID: \"24bceb1c-b610-49b3-9c13-410339a6755d\") " pod="openshift-ingress-canary/ingress-canary-b6x7k" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.916485 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.934581 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:31 crc kubenswrapper[4725]: E0225 10:56:31.934921 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.434910497 +0000 UTC m=+217.933492522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.936452 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.948508 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" Feb 25 10:56:31 crc kubenswrapper[4725]: W0225 10:56:31.962594 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80388d06_cb04_46d8_ae7a_fdaf4c66049f.slice/crio-108a39a08e59901195d68b0895d06e8b5cf2399d527aea97afc15c4fc557f306 WatchSource:0}: Error finding container 108a39a08e59901195d68b0895d06e8b5cf2399d527aea97afc15c4fc557f306: Status 404 returned error can't find the container with id 108a39a08e59901195d68b0895d06e8b5cf2399d527aea97afc15c4fc557f306 Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.972109 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.985225 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:31 crc kubenswrapper[4725]: I0225 10:56:31.987943 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.035215 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.035398 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 10:56:32.535369148 +0000 UTC m=+218.033951163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.036088 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.036525 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.536509097 +0000 UTC m=+218.035091122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.073343 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.081165 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.099608 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.129410 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5c7g8" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.137888 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.138249 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.638233831 +0000 UTC m=+218.136815856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.173782 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b6x7k" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.177530 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.181007 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.228490 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zszmh"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.239054 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ql8k8"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.239896 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.240247 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.740229502 +0000 UTC m=+218.238811527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.287332 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" event={"ID":"f97b6a23-48f6-459d-bed6-ccaa1c917e8a","Type":"ContainerStarted","Data":"d198ce885b1f7139b2b2771bb4485f7c1e3fb535dbb6df1f494c833b60572cb2"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.298008 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" event={"ID":"80388d06-cb04-46d8-ae7a-fdaf4c66049f","Type":"ContainerStarted","Data":"108a39a08e59901195d68b0895d06e8b5cf2399d527aea97afc15c4fc557f306"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.303087 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" event={"ID":"b0766ac3-b78a-453e-a45e-ad88770d2513","Type":"ContainerStarted","Data":"a15e237a8cc1097e905621c80755402e5bede6065fceea987760a1a71cd3c01a"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.303129 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" 
event={"ID":"b0766ac3-b78a-453e-a45e-ad88770d2513","Type":"ContainerStarted","Data":"49047bd567772dfc0bfb3fc07f4886423206dd27c63920378bfc5e3f4429e447"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.308791 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.314431 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.322936 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9p4cm" event={"ID":"fb51f87b-5859-44b4-ae55-c4f11ed0237b","Type":"ContainerStarted","Data":"863e453f1237ca30e52ee47f930fa1bea74aaaf93633c564f473da263b63ef87"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.322973 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9p4cm" event={"ID":"fb51f87b-5859-44b4-ae55-c4f11ed0237b","Type":"ContainerStarted","Data":"57062b79152eaf81d70e2d5c0e270a427845d99634ca81d666d9ed538e3fc99b"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.324095 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9p4cm" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.336174 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.341520 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-9p4cm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.341564 4725 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9p4cm" podUID="fb51f87b-5859-44b4-ae55-c4f11ed0237b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.342160 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.342446 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.842430968 +0000 UTC m=+218.341012983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.348319 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" event={"ID":"26ea044e-327f-4510-ae22-a6e7d61a6873","Type":"ContainerStarted","Data":"07fb12677a4eb51ca5f6dc3a21766d9babc58f68a8d20c668a2e5da09ae80765"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.365944 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" event={"ID":"7dc35e2f-3a10-41b4-ac03-753e62ff89a6","Type":"ContainerStarted","Data":"7f95745853cc250a986234080e6e97d1e4fbf1df4522ac191790e770c2be40dd"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.399656 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" event={"ID":"134ee45f-ab84-4033-bc7c-956e7a7721ae","Type":"ContainerStarted","Data":"109adb9a0b0e46cb615d5e48190be3a5f656a4ce70a61483b66b2da85a615def"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.403622 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" event={"ID":"28b55dc2-29a7-4828-8471-68dc3baffac6","Type":"ContainerStarted","Data":"dbd0dc39a79c72e39d151cc57ec274ab428ee3e3948d9acedd9d0cb874e1ac74"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.405310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5p82k" 
event={"ID":"e9ec5ada-4472-4a6e-862e-be351ff0542e","Type":"ContainerStarted","Data":"743191c0378e776d50687b304bb68e3c39b13443fcf2a8ee92eabd605f3278b9"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.406550 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" event={"ID":"395baad9-33c8-426a-8fe6-2fefe9c35fc2","Type":"ContainerStarted","Data":"06d4e5aaefbe5baecfba5bc9cd49c12948c7338309fec93539228a44b504691c"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.407883 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" event={"ID":"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f","Type":"ContainerStarted","Data":"f8c0a1b1c72af6857e2fa14557d73a1051b5875c60a9da70ef42450a91bb7909"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.410095 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" event={"ID":"0c330367-5495-4729-85ef-4ff602ab6808","Type":"ContainerStarted","Data":"d14e53bcf7050fed6b0f110a6ecc1102b462b88475cff2ded661580bb55aef41"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.410140 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" event={"ID":"0c330367-5495-4729-85ef-4ff602ab6808","Type":"ContainerStarted","Data":"c7ac369743483aee38498bfec602aedeaccc595ad18daf0e3507455b29626661"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.414309 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" event={"ID":"9ca7925f-e394-489d-afee-bfe1c49c0ced","Type":"ContainerStarted","Data":"f9f1dc24138769f7beb7b4034d8f2380d458aebc5a04e80f1f014b060966f50e"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.414344 4725 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" event={"ID":"9ca7925f-e394-489d-afee-bfe1c49c0ced","Type":"ContainerStarted","Data":"e3ee18067de62f83f53137827246d5b5199bc2d803d16cd883430f881933e1d0"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.443595 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.445388 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:32.945376662 +0000 UTC m=+218.443958687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.448265 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" event={"ID":"a248ae7c-6e03-4e10-bdd5-ef7e31335976","Type":"ContainerStarted","Data":"44e67069d3532a9085e696ca2bd64d1aa0f190a2249bc4108b67704c6383b936"} Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.456906 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9p4cm" podStartSLOduration=163.456891868 podStartE2EDuration="2m43.456891868s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:32.416397818 +0000 UTC m=+217.914979833" watchObservedRunningTime="2026-02-25 10:56:32.456891868 +0000 UTC m=+217.955473893" Feb 25 10:56:32 crc kubenswrapper[4725]: W0225 10:56:32.511288 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6199f7d7_c530_47d4_8cb6_1526dcba2266.slice/crio-8ec43f4179d63a9058925c7707475f96f4a43e20153506f7b4be8cfbe76750b1 WatchSource:0}: Error finding container 8ec43f4179d63a9058925c7707475f96f4a43e20153506f7b4be8cfbe76750b1: Status 404 returned error can't find the container with id 8ec43f4179d63a9058925c7707475f96f4a43e20153506f7b4be8cfbe76750b1 Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.546358 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.546492 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.04646813 +0000 UTC m=+218.545050155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.548712 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.552818 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.052804793 +0000 UTC m=+218.551386818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.581282 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8nwvl" podStartSLOduration=163.581261394 podStartE2EDuration="2m43.581261394s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:32.537703895 +0000 UTC m=+218.036285920" watchObservedRunningTime="2026-02-25 10:56:32.581261394 +0000 UTC m=+218.079843419" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.649542 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.649735 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.149704443 +0000 UTC m=+218.648286478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.649844 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.650133 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.150120123 +0000 UTC m=+218.648702148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: W0225 10:56:32.653572 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb15e4920_ccda_4486_84ea_f48a51517d73.slice/crio-73cb9bea73f809ee4bba4c5ff8c6432ed9cb08212c5f372b53e998861287d035 WatchSource:0}: Error finding container 73cb9bea73f809ee4bba4c5ff8c6432ed9cb08212c5f372b53e998861287d035: Status 404 returned error can't find the container with id 73cb9bea73f809ee4bba4c5ff8c6432ed9cb08212c5f372b53e998861287d035 Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.655686 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.699533 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.748939 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.751211 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 
10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.756505 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.256485426 +0000 UTC m=+218.755067451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.788390 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" podStartSLOduration=163.788373776 podStartE2EDuration="2m43.788373776s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:32.787235647 +0000 UTC m=+218.285817672" watchObservedRunningTime="2026-02-25 10:56:32.788373776 +0000 UTC m=+218.286955801" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.858053 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.858692 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.358680433 +0000 UTC m=+218.857262458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.864345 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-884q6" podStartSLOduration=163.864327018 podStartE2EDuration="2m43.864327018s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:32.821344783 +0000 UTC m=+218.319926818" watchObservedRunningTime="2026-02-25 10:56:32.864327018 +0000 UTC m=+218.362909043" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.940331 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-f4l29" podStartSLOduration=163.94031814 podStartE2EDuration="2m43.94031814s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:32.904333686 +0000 UTC m=+218.402915731" watchObservedRunningTime="2026-02-25 10:56:32.94031814 +0000 UTC m=+218.438900155" Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.942979 4725 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz"] Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.959707 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:32 crc kubenswrapper[4725]: I0225 10:56:32.960729 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw"] Feb 25 10:56:32 crc kubenswrapper[4725]: E0225 10:56:32.961088 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.461067693 +0000 UTC m=+218.959649718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.054046 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533616-zsh9g"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.077433 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.077476 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.077933 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.577904076 +0000 UTC m=+219.076486101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.180059 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.180679 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.680665846 +0000 UTC m=+219.179247871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.254468 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6trwd"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.274541 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mw7b2"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.282429 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.283173 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.78315672 +0000 UTC m=+219.281738745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.345228 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-brdsl"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.375288 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-56xfg"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.387600 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.387899 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.887884511 +0000 UTC m=+219.386466536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.394048 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qsb7p"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.408953 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.410795 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-q77vl"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.458495 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.489337 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7624"] Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.492998 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.493515 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:33.993499935 +0000 UTC m=+219.492081960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.524864 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b6x7k"] Feb 25 10:56:33 crc kubenswrapper[4725]: W0225 10:56:33.527844 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a31736_63a7_443c_a99b_b03a2c285f37.slice/crio-9f5fd149c3c7ec53f9b8524083f529a173e6ae5e901a59f29d8ff05dd73c7c39 WatchSource:0}: Error finding container 9f5fd149c3c7ec53f9b8524083f529a173e6ae5e901a59f29d8ff05dd73c7c39: Status 404 returned error can't find the container with id 9f5fd149c3c7ec53f9b8524083f529a173e6ae5e901a59f29d8ff05dd73c7c39 Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.541145 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" event={"ID":"134ee45f-ab84-4033-bc7c-956e7a7721ae","Type":"ContainerStarted","Data":"42c1df016340e751e2a584172bfbf10c68a44e2224d9b7bb3e6553e0d5c9ea0e"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.543359 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz"] Feb 25 10:56:33 crc 
kubenswrapper[4725]: I0225 10:56:33.570561 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" event={"ID":"de7222c9-af96-4a59-9188-b53187f1cbe3","Type":"ContainerStarted","Data":"5896bafa7ac6e69ac255b3a8f6aea9806cc1d03b999df33e6584f264d849ccc5"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.596999 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.598439 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.09841542 +0000 UTC m=+219.596997505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.618056 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" podStartSLOduration=164.618030004 podStartE2EDuration="2m44.618030004s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.616242518 +0000 UTC m=+219.114824553" watchObservedRunningTime="2026-02-25 10:56:33.618030004 +0000 UTC m=+219.116612039" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.643798 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" event={"ID":"26ea044e-327f-4510-ae22-a6e7d61a6873","Type":"ContainerStarted","Data":"6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.645346 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.651192 4725 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nrlgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 25 10:56:33 crc kubenswrapper[4725]: 
I0225 10:56:33.651353 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" podUID="26ea044e-327f-4510-ae22-a6e7d61a6873" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.723492 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4fpbw" podStartSLOduration=164.7225534 podStartE2EDuration="2m44.7225534s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.706361734 +0000 UTC m=+219.204943749" watchObservedRunningTime="2026-02-25 10:56:33.7225534 +0000 UTC m=+219.221135425" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.728133 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.729305 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" event={"ID":"b15e4920-ccda-4486-84ea-f48a51517d73","Type":"ContainerStarted","Data":"73cb9bea73f809ee4bba4c5ff8c6432ed9cb08212c5f372b53e998861287d035"} Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.729982 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.229965081 +0000 UTC m=+219.728547106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.730163 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.733480 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" event={"ID":"80388d06-cb04-46d8-ae7a-fdaf4c66049f","Type":"ContainerStarted","Data":"15b855817a5c500839d39adaa23cca4943da32ce27237fd5de0e3c0e4f98e0ad"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.733650 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.734120 4725 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-77cqr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.734184 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" podUID="b15e4920-ccda-4486-84ea-f48a51517d73" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.736513 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533616-zsh9g" event={"ID":"b0b17a01-64f4-4578-9e56-19825cfa713f","Type":"ContainerStarted","Data":"2d9b3fc9c0db4a55aff037595f1438561a1d93f5c574561047a2b34bc933b843"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.746796 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" event={"ID":"88082131-2ef3-4ffc-890b-132cad0248cb","Type":"ContainerStarted","Data":"3fc7a13006f9d2e8df0dac881c6628f5aa14cac9b84e3d08584d79a837b6c27e"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.750660 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" event={"ID":"63b30c59-fa34-4b6f-ac0b-7db7bf370389","Type":"ContainerStarted","Data":"4bad90c1e90224c477cbb6ee15a9ea2a796f9d567296cf950fdb516df2d6415a"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.762437 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7lb6x" event={"ID":"6199f7d7-c530-47d4-8cb6-1526dcba2266","Type":"ContainerStarted","Data":"a0197bd7993925bd96b39c963b963011b54d3f4c6d96753d5ded5ac9ff5a4a46"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.762482 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7lb6x" event={"ID":"6199f7d7-c530-47d4-8cb6-1526dcba2266","Type":"ContainerStarted","Data":"8ec43f4179d63a9058925c7707475f96f4a43e20153506f7b4be8cfbe76750b1"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.764252 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-p5mvj" podStartSLOduration=164.764228501 podStartE2EDuration="2m44.764228501s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.742594605 +0000 UTC m=+219.241176630" watchObservedRunningTime="2026-02-25 10:56:33.764228501 +0000 UTC m=+219.262810526" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.767362 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" event={"ID":"dc37ca79-806f-43b5-a818-a642aa281d69","Type":"ContainerStarted","Data":"c0849455d61908fd74e1c6f257d51429b8ed9e690719804881d9ea85b369fc62"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.775572 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zszmh" event={"ID":"598f09de-0be8-418a-a306-45517047d114","Type":"ContainerStarted","Data":"6939e0c3dd480083a7f6bf10ad0051d8d39deeffd7ba354ebcd7c814544f9b62"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.790802 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f8bfv" podStartSLOduration=164.790785343 podStartE2EDuration="2m44.790785343s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.76418069 +0000 UTC m=+219.262762725" watchObservedRunningTime="2026-02-25 10:56:33.790785343 +0000 UTC m=+219.289367368" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.817193 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5p82k" 
event={"ID":"e9ec5ada-4472-4a6e-862e-be351ff0542e","Type":"ContainerStarted","Data":"f480f481b0e61b2d55adb5a07fa9b3e102150558146406afb9d63d7c859fc7f5"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.818955 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" podStartSLOduration=164.818936157 podStartE2EDuration="2m44.818936157s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.79064047 +0000 UTC m=+219.289222505" watchObservedRunningTime="2026-02-25 10:56:33.818936157 +0000 UTC m=+219.317518182" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.819557 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" podStartSLOduration=164.819552183 podStartE2EDuration="2m44.819552183s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.817299685 +0000 UTC m=+219.315881710" watchObservedRunningTime="2026-02-25 10:56:33.819552183 +0000 UTC m=+219.318134208" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.820208 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.823234 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" event={"ID":"4f1b7c78-4561-435c-95c8-61939c32c761","Type":"ContainerStarted","Data":"81c3c7595c3b854bcdc495df653adf4dfb864269579d15e3cffc3f983c184c1f"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.824294 4725 patch_prober.go:28] 
interesting pod/console-operator-58897d9998-5p82k container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.824332 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5p82k" podUID="e9ec5ada-4472-4a6e-862e-be351ff0542e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.836302 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.836889 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.336874118 +0000 UTC m=+219.835456143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.837105 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.842365 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.342350158 +0000 UTC m=+219.840932183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.850892 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.851732 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.851790 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.879678 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" event={"ID":"f97b6a23-48f6-459d-bed6-ccaa1c917e8a","Type":"ContainerStarted","Data":"e732aeda80b27bde8f8f3b6d40a10199640b14779fa34bd2ccff41bebbda7f32"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.885465 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" 
event={"ID":"e2f68f82-94c6-45cb-acda-3d903d0f216e","Type":"ContainerStarted","Data":"184cf40e75990689ed91aab68f9d7fe06467c5ee45c5db26b43fd1170e25b2c1"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.885525 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" event={"ID":"e2f68f82-94c6-45cb-acda-3d903d0f216e","Type":"ContainerStarted","Data":"f3987c1591c2ade7ce28e70efce0778713a38dbe45f55aa474e54faa7cdd2929"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.898386 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" event={"ID":"4fa402ac-57ab-4b91-b7a7-2b3d6dae192f","Type":"ContainerStarted","Data":"4091fc612279c10e4f644a62edd37e9afa37a4e3c6a90d6ccbff5f8e9846e7fb"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.906697 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7lb6x" podStartSLOduration=164.906680391 podStartE2EDuration="2m44.906680391s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.860920076 +0000 UTC m=+219.359502111" watchObservedRunningTime="2026-02-25 10:56:33.906680391 +0000 UTC m=+219.405262416" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.920739 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" event={"ID":"28b55dc2-29a7-4828-8471-68dc3baffac6","Type":"ContainerStarted","Data":"04db2707a6b45f98c443b1171c2231d7f8f2744b0270ca21f30f2717a4ab0867"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.931442 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" 
event={"ID":"93efef4f-c6c1-47b8-ba83-12c56c3b08ea","Type":"ContainerStarted","Data":"80f1f1bfcd64a15c1e9e1162117509b68ea35b282483b674307a06181fff27cf"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.932893 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" event={"ID":"6cf0f13c-4d57-434c-9a4c-d7621e13350c","Type":"ContainerStarted","Data":"01b90fe8d85a3a2991cffbe39adcb38ae0045de948020eb5572514ce49b71fb5"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.935349 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" podStartSLOduration=164.935338098 podStartE2EDuration="2m44.935338098s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.903691025 +0000 UTC m=+219.402273060" watchObservedRunningTime="2026-02-25 10:56:33.935338098 +0000 UTC m=+219.433920123" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.936133 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rxn4n" podStartSLOduration=164.936129208 podStartE2EDuration="2m44.936129208s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:33.935747338 +0000 UTC m=+219.434329373" watchObservedRunningTime="2026-02-25 10:56:33.936129208 +0000 UTC m=+219.434711233" Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.937116 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" 
event={"ID":"08fe5978-cb79-459f-b51a-b8f769ea177f","Type":"ContainerStarted","Data":"c2ab7b9ad8e452c921bc3ec6ba0a7db5c7b6a271d4a401ab7492d483593f3ede"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.938140 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:33 crc kubenswrapper[4725]: E0225 10:56:33.940748 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.440718266 +0000 UTC m=+219.939300291 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.942075 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" event={"ID":"58ea6113-66d2-421d-b7cd-723463055f04","Type":"ContainerStarted","Data":"34497933c0f2ac3d84257920556c4d0f6ac2cf3b4f7dd778d438408872e3218c"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.947767 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5c7g8" 
event={"ID":"7c17d10e-278a-4879-a7c6-debfdd094f48","Type":"ContainerStarted","Data":"da2fedd09d72f636937420b657e32cc7675026d2c2616d4b71ee42723bbb6ed7"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.953085 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" event={"ID":"24bebe29-933d-4461-8aab-b7d17e815781","Type":"ContainerStarted","Data":"d30fffcbdbaaf02fe2d40032c7e8b59a7420fd4a0ff9c0ecd61b48c426f360da"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.965000 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" event={"ID":"3a1d826c-e67e-4932-ab2c-41e53f848529","Type":"ContainerStarted","Data":"847d75de94a956f5ff9c399d8ccba2f1d28a0e4e3efae7f86cca897b8863a4cb"} Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.969762 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-9p4cm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 25 10:56:33 crc kubenswrapper[4725]: I0225 10:56:33.969807 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9p4cm" podUID="fb51f87b-5859-44b4-ae55-c4f11ed0237b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.010476 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5p82k" podStartSLOduration=165.010454838 podStartE2EDuration="2m45.010454838s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 
10:56:33.983909816 +0000 UTC m=+219.482491841" watchObservedRunningTime="2026-02-25 10:56:34.010454838 +0000 UTC m=+219.509036863" Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.039222 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.040706 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.540678175 +0000 UTC m=+220.039260200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.059299 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vtck6" podStartSLOduration=165.059278812 podStartE2EDuration="2m45.059278812s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:34.019581772 +0000 UTC m=+219.518163797" watchObservedRunningTime="2026-02-25 10:56:34.059278812 +0000 UTC m=+219.557860837" Feb 25 
10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.065076 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" podStartSLOduration=165.065057051 podStartE2EDuration="2m45.065057051s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:34.059943079 +0000 UTC m=+219.558525114" watchObservedRunningTime="2026-02-25 10:56:34.065057051 +0000 UTC m=+219.563639076" Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.098598 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5c7g8" podStartSLOduration=6.098582002 podStartE2EDuration="6.098582002s" podCreationTimestamp="2026-02-25 10:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:34.093138383 +0000 UTC m=+219.591720418" watchObservedRunningTime="2026-02-25 10:56:34.098582002 +0000 UTC m=+219.597164027" Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.145373 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.145978 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.6459527 +0000 UTC m=+220.144534735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.146287 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.149242 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.649212433 +0000 UTC m=+220.147794458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.265197 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.266172 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.766125797 +0000 UTC m=+220.264707832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.337904 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vkmbp" podStartSLOduration=165.337880331 podStartE2EDuration="2m45.337880331s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:34.145286532 +0000 UTC m=+219.643868597" watchObservedRunningTime="2026-02-25 10:56:34.337880331 +0000 UTC m=+219.836462366" Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.344362 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nrlgl"] Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.351723 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"] Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.368506 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.368795 4725 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.868782885 +0000 UTC m=+220.367364910 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.469435 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.469717 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:34.969702458 +0000 UTC m=+220.468284483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.573439 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.573748 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.073736922 +0000 UTC m=+220.572318947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.678576 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.679178 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.179163141 +0000 UTC m=+220.677745166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.780517 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.780812 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.280801132 +0000 UTC m=+220.779383157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.865009 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 10:56:34 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 25 10:56:34 crc kubenswrapper[4725]: [+]process-running ok Feb 25 10:56:34 crc kubenswrapper[4725]: healthz check failed Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.865064 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.881243 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.881630 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 10:56:35.381611493 +0000 UTC m=+220.880193518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.984242 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:34 crc kubenswrapper[4725]: E0225 10:56:34.984805 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.484792424 +0000 UTC m=+220.983374449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:34 crc kubenswrapper[4725]: I0225 10:56:34.986008 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" event={"ID":"e992a203-4363-40e2-a056-11aa5e5f11c3","Type":"ContainerStarted","Data":"36f604db45f2aad88b294bfe48336d0b3f9160c8f692ff6db60e3cf86c5946e5"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.001976 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" event={"ID":"dc37ca79-806f-43b5-a818-a642aa281d69","Type":"ContainerStarted","Data":"dc0cb76ae8157aeaa649687dfb3bcf5eff316e14d77027800e05081e10cb5715"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.009308 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" event={"ID":"58ea6113-66d2-421d-b7cd-723463055f04","Type":"ContainerStarted","Data":"37b870207952844888faae8a427705b365de0559e2ed338ae2fe431561c9aea8"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.018151 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" event={"ID":"42ef2d6e-c9d8-44ed-b5f9-b0853923968f","Type":"ContainerStarted","Data":"04de47a05313f25f72add14f20d84c7e1ef1e21324d58b2fadd3f920eb93afd4"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.018199 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" 
event={"ID":"42ef2d6e-c9d8-44ed-b5f9-b0853923968f","Type":"ContainerStarted","Data":"bc4bee7d34913895436de1cc795f29342cf66b76f45d0fa963349df1acfb3929"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.032726 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hnqfj" podStartSLOduration=166.032706355 podStartE2EDuration="2m46.032706355s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.030332994 +0000 UTC m=+220.528915019" watchObservedRunningTime="2026-02-25 10:56:35.032706355 +0000 UTC m=+220.531288380" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.033673 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5c7g8" event={"ID":"7c17d10e-278a-4879-a7c6-debfdd094f48","Type":"ContainerStarted","Data":"c247c18557ef7dca064217896bd087b4722362713d762c58f2867f599d55c6c8"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.050315 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" event={"ID":"6cf0f13c-4d57-434c-9a4c-d7621e13350c","Type":"ContainerStarted","Data":"f167c6d1970b85cc5fea9d1046c6c67cd52bac9e414d50fa2fbe030d8a05608a"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.053338 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" event={"ID":"395baad9-33c8-426a-8fe6-2fefe9c35fc2","Type":"ContainerStarted","Data":"18cb6909f7c5e740d1275ae6519e1afd962cdc9ffa2ab78827037e0dfa926252"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.053655 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.068094 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-brdsl" podStartSLOduration=166.068077004 podStartE2EDuration="2m46.068077004s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.057749419 +0000 UTC m=+220.556331444" watchObservedRunningTime="2026-02-25 10:56:35.068077004 +0000 UTC m=+220.566659029" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.079313 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.089272 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.090479 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.590464979 +0000 UTC m=+221.089047004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.095893 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" event={"ID":"24bebe29-933d-4461-8aab-b7d17e815781","Type":"ContainerStarted","Data":"c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.097138 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.100025 4725 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6trwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.100063 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" podUID="24bebe29-933d-4461-8aab-b7d17e815781" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.105149 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" 
event={"ID":"3a1d826c-e67e-4932-ab2c-41e53f848529","Type":"ContainerStarted","Data":"97cb1367d92c0198ffd613f6e1c734095b5f717bf8aa6b09d5178d3fd2f02a02"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.106037 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.118129 4725 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zxhvz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.118190 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" podUID="3a1d826c-e67e-4932-ab2c-41e53f848529" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.124359 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" event={"ID":"88082131-2ef3-4ffc-890b-132cad0248cb","Type":"ContainerStarted","Data":"0577554b5972c3e82b01cb00d8f6d35118f00406b9549823f328f22e427e680b"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.147102 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" event={"ID":"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848","Type":"ContainerStarted","Data":"87fb811617c4dfdb40e63358582cbd9077d1c3c1344ae9e5664b3d7da661d69a"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.147147 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" 
event={"ID":"c7f92e9e-983b-42cb-9eb3-28c8f5a0c848","Type":"ContainerStarted","Data":"12071cbcdf70626cadb27047545bda9caf203639ccd1b22148277715ab9b4c1d"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.154404 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tnjxg" podStartSLOduration=166.154388672 podStartE2EDuration="2m46.154388672s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.087695218 +0000 UTC m=+220.586277243" watchObservedRunningTime="2026-02-25 10:56:35.154388672 +0000 UTC m=+220.652970697" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.154691 4725 generic.go:334] "Generic (PLEG): container finished" podID="c35a3cc3-c02a-43bc-aba2-22117865c274" containerID="23d7bf3eae66c4a31c0ada040b121f924554b123080f0a23968ad76f3aa01b23" exitCode=0 Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.154759 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" event={"ID":"c35a3cc3-c02a-43bc-aba2-22117865c274","Type":"ContainerDied","Data":"23d7bf3eae66c4a31c0ada040b121f924554b123080f0a23968ad76f3aa01b23"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.154794 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" event={"ID":"c35a3cc3-c02a-43bc-aba2-22117865c274","Type":"ContainerStarted","Data":"cc1c08fff5e7015c112326a216d88efc60390cc4ae10232215810a0c90ae1734"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.198258 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" 
(UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.201487 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.701475792 +0000 UTC m=+221.200057807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.202527 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kmdm" podStartSLOduration=166.202508448 podStartE2EDuration="2m46.202508448s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.19830549 +0000 UTC m=+220.696887525" watchObservedRunningTime="2026-02-25 10:56:35.202508448 +0000 UTC m=+220.701090473" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.209386 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" event={"ID":"63b30c59-fa34-4b6f-ac0b-7db7bf370389","Type":"ContainerStarted","Data":"f8ca5a1e2860e9992dc5bd2b18a096737b3da649548e7bf62bcef7002fd78bc8"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.230520 4725 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" podStartSLOduration=166.230504288 podStartE2EDuration="2m46.230504288s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.228081515 +0000 UTC m=+220.726663560" watchObservedRunningTime="2026-02-25 10:56:35.230504288 +0000 UTC m=+220.729086313" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.247953 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b6x7k" event={"ID":"24bceb1c-b610-49b3-9c13-410339a6755d","Type":"ContainerStarted","Data":"535d7b809e1497ba2681dc0a21411c342f7e298c0ea8de876cb87688ef905de9"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.249501 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" event={"ID":"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c","Type":"ContainerStarted","Data":"b4d848d1be2c93917eda1dc4024d20e58e799047b799b52e92e8cd126709c18c"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.252938 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" event={"ID":"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474","Type":"ContainerStarted","Data":"0fba3923d377ead43ca148ced91266be82ac2d3f9cc0d9b2ed601dbe007189f5"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.252971 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" event={"ID":"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474","Type":"ContainerStarted","Data":"99c9e06b2baa4d1223eeff0c502f98e8223eec3bb0f0407507eca3448d16e1a0"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.269886 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.285050 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" podStartSLOduration=166.285035469 podStartE2EDuration="2m46.285035469s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.283395487 +0000 UTC m=+220.781977522" watchObservedRunningTime="2026-02-25 10:56:35.285035469 +0000 UTC m=+220.783617494" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.286560 4725 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7624 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.286643 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.301645 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.303208 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.803188795 +0000 UTC m=+221.301770820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.337603 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q77vl" podStartSLOduration=166.337588609 podStartE2EDuration="2m46.337588609s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.337090656 +0000 UTC m=+220.835672691" watchObservedRunningTime="2026-02-25 10:56:35.337588609 +0000 UTC m=+220.836170634" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.347871 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" event={"ID":"de7222c9-af96-4a59-9188-b53187f1cbe3","Type":"ContainerStarted","Data":"baf335efc76d3b2e81ef97c6e3d849d31d0863059f8b30894b263c38fdecd198"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.397431 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" 
event={"ID":"4f1b7c78-4561-435c-95c8-61939c32c761","Type":"ContainerStarted","Data":"0ade04f60d56ffdbbf0b7e8ef5dc5f2dc1c3d8a831453f54862367744a60dd32"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.404600 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.404895 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:35.904883448 +0000 UTC m=+221.403465473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.428798 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" event={"ID":"93efef4f-c6c1-47b8-ba83-12c56c3b08ea","Type":"ContainerStarted","Data":"31f9ea17491d6f5725d4b576e2094fb4e67d85a695f01496425c10f2845a4e2c"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.433965 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:35 crc 
kubenswrapper[4725]: I0225 10:56:35.434004 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.451662 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" event={"ID":"d2fa3801-20c3-4d68-88fc-0376b23f7b5d","Type":"ContainerStarted","Data":"dfb9de8feabc0bfd2ea11704976af4b02f419a9d21c16c7828e832af5a981630"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.451706 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" event={"ID":"d2fa3801-20c3-4d68-88fc-0376b23f7b5d","Type":"ContainerStarted","Data":"73b45fa2fe638d9cc886f9bfbbea77e6771aecc48999d262e3a4f6f39e6ee0d7"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.465737 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5" event={"ID":"80388d06-cb04-46d8-ae7a-fdaf4c66049f","Type":"ContainerStarted","Data":"00e73826a3bbfad45679a0b6ff950c2dfefd2cc939aa492adb432b3a115a068a"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.472786 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zszmh" event={"ID":"598f09de-0be8-418a-a306-45517047d114","Type":"ContainerStarted","Data":"2f254d9c119d37b8babb51936d463ae9d77ffe3a2ab9bcc5daaa16443e46c4a5"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.473504 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.474648 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.486518 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" event={"ID":"14a31736-63a7-443c-a99b-b03a2c285f37","Type":"ContainerStarted","Data":"97a92fe8af7e838210f8eca0aa2fcfa2e52ed7fd03ab1c632bfc489b3af2854d"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.486560 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" event={"ID":"14a31736-63a7-443c-a99b-b03a2c285f37","Type":"ContainerStarted","Data":"9f5fd149c3c7ec53f9b8524083f529a173e6ae5e901a59f29d8ff05dd73c7c39"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.487291 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.488986 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" event={"ID":"08fe5978-cb79-459f-b51a-b8f769ea177f","Type":"ContainerStarted","Data":"9efa1097b38368bb85aa4b081c9f8cb61478e622441ab1602bfa0088065f26de"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.495044 4725 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-njgfq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.495100 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" podUID="14a31736-63a7-443c-a99b-b03a2c285f37" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.498257 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" event={"ID":"b15e4920-ccda-4486-84ea-f48a51517d73","Type":"ContainerStarted","Data":"660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.505499 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.505663 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.005622057 +0000 UTC m=+221.504204082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.506023 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.508284 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.008270205 +0000 UTC m=+221.506852230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.510947 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.515570 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" event={"ID":"e2f68f82-94c6-45cb-acda-3d903d0f216e","Type":"ContainerStarted","Data":"bff29d49d786b175063b755a5af49244a42a4f44c17b01abad95a9ee370b4f85"} Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.516644 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-9p4cm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.516725 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9p4cm" podUID="fb51f87b-5859-44b4-ae55-c4f11ed0237b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.533860 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5p82k" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.539846 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wntf7" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.543067 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.548873 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hwdf9" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.554163 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" podStartSLOduration=166.554131613 podStartE2EDuration="2m46.554131613s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.551789373 +0000 UTC m=+221.050371398" watchObservedRunningTime="2026-02-25 10:56:35.554131613 +0000 UTC m=+221.052713638" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.606493 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.608368 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.108347916 +0000 UTC m=+221.606929941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.648671 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" podStartSLOduration=166.648656052 podStartE2EDuration="2m46.648656052s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.647191425 +0000 UTC m=+221.145773450" watchObservedRunningTime="2026-02-25 10:56:35.648656052 +0000 UTC m=+221.147238077" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.685395 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b6x7k" podStartSLOduration=7.685378506 podStartE2EDuration="7.685378506s" podCreationTimestamp="2026-02-25 10:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.673890051 +0000 UTC m=+221.172472086" watchObservedRunningTime="2026-02-25 10:56:35.685378506 +0000 UTC m=+221.183960531" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.707665 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" podStartSLOduration=166.707649728 podStartE2EDuration="2m46.707649728s" podCreationTimestamp="2026-02-25 10:53:49 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.707236197 +0000 UTC m=+221.205818232" watchObservedRunningTime="2026-02-25 10:56:35.707649728 +0000 UTC m=+221.206231753" Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.708139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.708542 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.208529791 +0000 UTC m=+221.707111816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.808891 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.809081 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.309056464 +0000 UTC m=+221.807638489 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.809231 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.809514 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.309502095 +0000 UTC m=+221.808084120 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.855460 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 25 10:56:35 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld
Feb 25 10:56:35 crc kubenswrapper[4725]: [+]process-running ok
Feb 25 10:56:35 crc kubenswrapper[4725]: healthz check failed
Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.855514 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.886814 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zszmh" podStartSLOduration=7.886793891 podStartE2EDuration="7.886793891s" podCreationTimestamp="2026-02-25 10:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:35.878056087 +0000 UTC m=+221.376638112" watchObservedRunningTime="2026-02-25 10:56:35.886793891 +0000 UTC m=+221.385375916"
Feb 25 10:56:35 crc kubenswrapper[4725]: I0225 10:56:35.910344 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:35 crc kubenswrapper[4725]: E0225 10:56:35.910759 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.410744667 +0000 UTC m=+221.909326692 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.012468 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.030189 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mfshs" podStartSLOduration=167.030172574 podStartE2EDuration="2m47.030172574s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.029402685 +0000 UTC m=+221.527984710" watchObservedRunningTime="2026-02-25 10:56:36.030172574 +0000 UTC m=+221.528754599"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.046322 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.546304689 +0000 UTC m=+222.044886714 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.113244 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.113488 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.613464625 +0000 UTC m=+222.112046650 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.113723 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.114030 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.614018569 +0000 UTC m=+222.112600594 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.175732 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq" podStartSLOduration=167.175717674 podStartE2EDuration="2m47.175717674s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.1732211 +0000 UTC m=+221.671803145" watchObservedRunningTime="2026-02-25 10:56:36.175717674 +0000 UTC m=+221.674299699"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.214484 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.214818 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.714802299 +0000 UTC m=+222.213384324 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.316086 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.316419 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.816407599 +0000 UTC m=+222.314989624 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.417184 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.417462 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:36.917447676 +0000 UTC m=+222.416029701 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.426661 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gbzbf" podStartSLOduration=167.426646582 podStartE2EDuration="2m47.426646582s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.425902713 +0000 UTC m=+221.924484738" watchObservedRunningTime="2026-02-25 10:56:36.426646582 +0000 UTC m=+221.925228617"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.482257 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" podStartSLOduration=167.48223696 podStartE2EDuration="2m47.48223696s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.476993966 +0000 UTC m=+221.975576001" watchObservedRunningTime="2026-02-25 10:56:36.48223696 +0000 UTC m=+221.980818985"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.517955 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.518464 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.018454181 +0000 UTC m=+222.517036206 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.543392 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" event={"ID":"4f1b7c78-4561-435c-95c8-61939c32c761","Type":"ContainerStarted","Data":"41e0b2f152f1b3c543ec01f3ba8601049080facec2e8d8340ef5a4b6695c988d"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.560019 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b6x7k" event={"ID":"24bceb1c-b610-49b3-9c13-410339a6755d","Type":"ContainerStarted","Data":"cf5d975bd0a877347043695e56c6ac0add7653a28319264b7a4b765663564ded"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.579332 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" event={"ID":"58ea6113-66d2-421d-b7cd-723463055f04","Type":"ContainerStarted","Data":"1a404382f249c25593ea8f2b1d0d7b2d97caf7e318b8387a9ddf460bfdea78e3"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.585943 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgzgw" event={"ID":"d2fa3801-20c3-4d68-88fc-0376b23f7b5d","Type":"ContainerStarted","Data":"52c7291c1fc7bfdd05f95b01f84d216262738e3f89b73518e1cd0fc94b1a23a6"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.592918 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" event={"ID":"6cf0f13c-4d57-434c-9a4c-d7621e13350c","Type":"ContainerStarted","Data":"a905c99577d43cd2dcdf9d7d7c9cee60dcb9f04c58efaf5ce1d7f1176b60f44d"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.613682 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" event={"ID":"c35a3cc3-c02a-43bc-aba2-22117865c274","Type":"ContainerStarted","Data":"523f657ea98dc08cad8705a04e4db9e50264fd26376d0ddb366a5555bf0023b6"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.613726 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" event={"ID":"c35a3cc3-c02a-43bc-aba2-22117865c274","Type":"ContainerStarted","Data":"a79408fc68fe41b7bd637ec11e33ba55a28824ab50f81a15114a5e2ab580a280"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.630275 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.630343 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mw7b2" podStartSLOduration=167.630327016 podStartE2EDuration="2m47.630327016s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.615996127 +0000 UTC m=+222.114578172" watchObservedRunningTime="2026-02-25 10:56:36.630327016 +0000 UTC m=+222.128909041"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.631159 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bx66x" podStartSLOduration=167.631152987 podStartE2EDuration="2m47.631152987s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.515954987 +0000 UTC m=+222.014537002" watchObservedRunningTime="2026-02-25 10:56:36.631152987 +0000 UTC m=+222.129735012"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.631255 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.131237659 +0000 UTC m=+222.629819684 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.643877 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wc2rt" event={"ID":"63b30c59-fa34-4b6f-ac0b-7db7bf370389","Type":"ContainerStarted","Data":"cbbef61c5ba32df019da04d8a46db423bfeb1402d31bdb743ed2fafb189778c1"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.669761 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-shjdf" podStartSLOduration=167.669747449 podStartE2EDuration="2m47.669747449s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.667993084 +0000 UTC m=+222.166575109" watchObservedRunningTime="2026-02-25 10:56:36.669747449 +0000 UTC m=+222.168329474"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.686281 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" event={"ID":"e992a203-4363-40e2-a056-11aa5e5f11c3","Type":"ContainerStarted","Data":"e5a68faca89c4df4d241871b4d91beb71c7ad7078f2e73013346d88d6de709d4"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.686323 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" event={"ID":"e992a203-4363-40e2-a056-11aa5e5f11c3","Type":"ContainerStarted","Data":"02fb71c646575bbe01ed44e33c3fecf9e2d03479a59cff2e95cf1621de8ee9fe"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.702354 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" podStartSLOduration=167.702337606 podStartE2EDuration="2m47.702337606s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.70172778 +0000 UTC m=+222.200309805" watchObservedRunningTime="2026-02-25 10:56:36.702337606 +0000 UTC m=+222.200919631"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.709580 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjxjp"]
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.710417 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.716593 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.724857 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zszmh" event={"ID":"598f09de-0be8-418a-a306-45517047d114","Type":"ContainerStarted","Data":"40b4ef3679c80b2e044507d8ce2cea659783ba6b4ff3478bcce698bcd81dd7ef"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.729633 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-56xfg" podStartSLOduration=167.729618547 podStartE2EDuration="2m47.729618547s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:36.726872827 +0000 UTC m=+222.225454852" watchObservedRunningTime="2026-02-25 10:56:36.729618547 +0000 UTC m=+222.228200572"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.737540 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-catalog-content\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.737586 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.737611 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl4qg\" (UniqueName: \"kubernetes.io/projected/8f0d98c3-7ffa-4029-ab5c-c252062b3099-kube-api-access-gl4qg\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.737637 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-utilities\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.739470 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.23945507 +0000 UTC m=+222.738037095 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.755010 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjxjp"]
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.777405 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57bqz" event={"ID":"dc0834fc-dc53-4913-93a1-a76b1ebf7d0c","Type":"ContainerStarted","Data":"114ec8c91042b605ee760ee448328ac4c067fa9df73652731bb5fa088e69acc0"}
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.780924 4725 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7624 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body=
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.780971 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.784618 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" podUID="26ea044e-327f-4510-ae22-a6e7d61a6873" containerName="controller-manager" containerID="cri-o://6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a" gracePeriod=30
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.786326 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" podUID="b15e4920-ccda-4486-84ea-f48a51517d73" containerName="route-controller-manager" containerID="cri-o://660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12" gracePeriod=30
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.807913 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-njgfq"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.838502 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.839135 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-catalog-content\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.839218 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl4qg\" (UniqueName: \"kubernetes.io/projected/8f0d98c3-7ffa-4029-ab5c-c252062b3099-kube-api-access-gl4qg\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.839335 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-utilities\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.840055 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.340041604 +0000 UTC m=+222.838623629 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.840993 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-catalog-content\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.850110 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-utilities\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.861696 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 25 10:56:36 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld
Feb 25 10:56:36 crc kubenswrapper[4725]: [+]process-running ok
Feb 25 10:56:36 crc kubenswrapper[4725]: healthz check failed
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.861750 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.886404 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl4qg\" (UniqueName: \"kubernetes.io/projected/8f0d98c3-7ffa-4029-ab5c-c252062b3099-kube-api-access-gl4qg\") pod \"certified-operators-qjxjp\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " pod="openshift-marketplace/certified-operators-qjxjp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.889002 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2tdp"]
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.890019 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tdp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.895875 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.923910 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2tdp"]
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.944493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.944555 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zw7\" (UniqueName: \"kubernetes.io/projected/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-kube-api-access-t7zw7\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.944587 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-catalog-content\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.944617 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-utilities\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp"
Feb 25 10:56:36 crc kubenswrapper[4725]: E0225 10:56:36.944928 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.444915949 +0000 UTC m=+222.943497974 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.988952 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:36 crc kubenswrapper[4725]: I0225 10:56:36.989312 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p"
Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.008956 4725 patch_prober.go:28] interesting pod/apiserver-76f77b778f-qsb7p container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.009007 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" podUID="c35a3cc3-c02a-43bc-aba2-22117865c274"
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.043000 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjxjp" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.046269 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.046497 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zw7\" (UniqueName: \"kubernetes.io/projected/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-kube-api-access-t7zw7\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.046556 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-catalog-content\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.046585 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-utilities\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 
10:56:37.046996 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-utilities\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.047071 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.547055214 +0000 UTC m=+223.045637239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.047516 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-catalog-content\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.073666 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bq27c"] Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.074606 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.086497 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zw7\" (UniqueName: \"kubernetes.io/projected/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-kube-api-access-t7zw7\") pod \"community-operators-l2tdp\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.094059 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bq27c"] Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.146142 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44496: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.147368 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-utilities\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.147429 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgnwr\" (UniqueName: \"kubernetes.io/projected/dec8f4b6-001e-4ce7-b6d4-55b197612a38-kube-api-access-vgnwr\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.147465 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-catalog-content\") pod \"certified-operators-bq27c\" (UID: 
\"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.147493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.147766 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.647756401 +0000 UTC m=+223.146338416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.226201 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44504: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.248706 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 
10:56:37.248996 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-utilities\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.249033 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgnwr\" (UniqueName: \"kubernetes.io/projected/dec8f4b6-001e-4ce7-b6d4-55b197612a38-kube-api-access-vgnwr\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.249072 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-catalog-content\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.249606 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-catalog-content\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.249698 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.7496788 +0000 UTC m=+223.248260825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.249988 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-utilities\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.291631 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgnwr\" (UniqueName: \"kubernetes.io/projected/dec8f4b6-001e-4ce7-b6d4-55b197612a38-kube-api-access-vgnwr\") pod \"certified-operators-bq27c\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.297418 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.299792 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dcstn"] Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.304623 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dcstn"] Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.304748 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.310364 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44508: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.350556 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.350596 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-utilities\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.350623 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-catalog-content\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.350659 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxw9b\" (UniqueName: \"kubernetes.io/projected/47446d07-b5cf-4646-b54b-0e841fb3a662-kube-api-access-lxw9b\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: 
E0225 10:56:37.351027 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:37.851005794 +0000 UTC m=+223.349587889 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.432401 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44520: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.447563 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.451898 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.452136 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-catalog-content\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.452192 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxw9b\" (UniqueName: \"kubernetes.io/projected/47446d07-b5cf-4646-b54b-0e841fb3a662-kube-api-access-lxw9b\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.452310 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-utilities\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.453304 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 10:56:37.953286732 +0000 UTC m=+223.451868757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.496731 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-utilities\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.498105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-catalog-content\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.508955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxw9b\" (UniqueName: \"kubernetes.io/projected/47446d07-b5cf-4646-b54b-0e841fb3a662-kube-api-access-lxw9b\") pod \"community-operators-dcstn\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.529900 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44530: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.554125 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.554521 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.054506013 +0000 UTC m=+223.553088038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.633694 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44542: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.652801 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.655613 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.655979 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.1559619 +0000 UTC m=+223.654543925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.677289 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.700479 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d985b9fd6-2zlh6"] Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.700668 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ea044e-327f-4510-ae22-a6e7d61a6873" containerName="controller-manager" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.700679 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ea044e-327f-4510-ae22-a6e7d61a6873" containerName="controller-manager" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.700769 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ea044e-327f-4510-ae22-a6e7d61a6873" containerName="controller-manager" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.701116 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.715009 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.757550 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44544: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.757877 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-proxy-ca-bundles\") pod \"26ea044e-327f-4510-ae22-a6e7d61a6873\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.757929 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-client-ca\") pod \"26ea044e-327f-4510-ae22-a6e7d61a6873\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.757948 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55t2b\" (UniqueName: \"kubernetes.io/projected/b15e4920-ccda-4486-84ea-f48a51517d73-kube-api-access-55t2b\") pod \"b15e4920-ccda-4486-84ea-f48a51517d73\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758125 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-config\") pod \"b15e4920-ccda-4486-84ea-f48a51517d73\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758157 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-client-ca\") pod \"b15e4920-ccda-4486-84ea-f48a51517d73\" (UID: 
\"b15e4920-ccda-4486-84ea-f48a51517d73\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758202 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ea044e-327f-4510-ae22-a6e7d61a6873-serving-cert\") pod \"26ea044e-327f-4510-ae22-a6e7d61a6873\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758220 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15e4920-ccda-4486-84ea-f48a51517d73-serving-cert\") pod \"b15e4920-ccda-4486-84ea-f48a51517d73\" (UID: \"b15e4920-ccda-4486-84ea-f48a51517d73\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758239 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-config\") pod \"26ea044e-327f-4510-ae22-a6e7d61a6873\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758267 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb7r6\" (UniqueName: \"kubernetes.io/projected/26ea044e-327f-4510-ae22-a6e7d61a6873-kube-api-access-xb7r6\") pod \"26ea044e-327f-4510-ae22-a6e7d61a6873\" (UID: \"26ea044e-327f-4510-ae22-a6e7d61a6873\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758396 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-proxy-ca-bundles\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758407 4725 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-d985b9fd6-2zlh6"] Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758458 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9chb\" (UniqueName: \"kubernetes.io/projected/edf043c0-bbd9-4411-a187-872e252bb850-kube-api-access-b9chb\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758529 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf043c0-bbd9-4411-a187-872e252bb850-serving-cert\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758548 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-client-ca\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758572 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.758596 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-config\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.759819 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-config" (OuterVolumeSpecName: "config") pod "26ea044e-327f-4510-ae22-a6e7d61a6873" (UID: "26ea044e-327f-4510-ae22-a6e7d61a6873"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.759904 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-client-ca" (OuterVolumeSpecName: "client-ca") pod "b15e4920-ccda-4486-84ea-f48a51517d73" (UID: "b15e4920-ccda-4486-84ea-f48a51517d73"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.760434 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-client-ca" (OuterVolumeSpecName: "client-ca") pod "26ea044e-327f-4510-ae22-a6e7d61a6873" (UID: "26ea044e-327f-4510-ae22-a6e7d61a6873"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.760811 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "26ea044e-327f-4510-ae22-a6e7d61a6873" (UID: "26ea044e-327f-4510-ae22-a6e7d61a6873"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.765398 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ea044e-327f-4510-ae22-a6e7d61a6873-kube-api-access-xb7r6" (OuterVolumeSpecName: "kube-api-access-xb7r6") pod "26ea044e-327f-4510-ae22-a6e7d61a6873" (UID: "26ea044e-327f-4510-ae22-a6e7d61a6873"). InnerVolumeSpecName "kube-api-access-xb7r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.765778 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.265765291 +0000 UTC m=+223.764347316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.769444 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ea044e-327f-4510-ae22-a6e7d61a6873-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "26ea044e-327f-4510-ae22-a6e7d61a6873" (UID: "26ea044e-327f-4510-ae22-a6e7d61a6873"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.778718 4725 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6trwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.778771 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" podUID="24bebe29-933d-4461-8aab-b7d17e815781" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.779343 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-config" (OuterVolumeSpecName: "config") pod "b15e4920-ccda-4486-84ea-f48a51517d73" (UID: "b15e4920-ccda-4486-84ea-f48a51517d73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.783584 4725 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zxhvz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.783633 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" podUID="3a1d826c-e67e-4932-ab2c-41e53f848529" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.784520 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15e4920-ccda-4486-84ea-f48a51517d73-kube-api-access-55t2b" (OuterVolumeSpecName: "kube-api-access-55t2b") pod "b15e4920-ccda-4486-84ea-f48a51517d73" (UID: "b15e4920-ccda-4486-84ea-f48a51517d73"). InnerVolumeSpecName "kube-api-access-55t2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.784930 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15e4920-ccda-4486-84ea-f48a51517d73-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b15e4920-ccda-4486-84ea-f48a51517d73" (UID: "b15e4920-ccda-4486-84ea-f48a51517d73"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.867477 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.870364 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-proxy-ca-bundles\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.870483 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9chb\" (UniqueName: \"kubernetes.io/projected/edf043c0-bbd9-4411-a187-872e252bb850-kube-api-access-b9chb\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.871934 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.371914369 +0000 UTC m=+223.870496404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.872036 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf043c0-bbd9-4411-a187-872e252bb850-serving-cert\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.872088 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-client-ca\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.872149 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.872203 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-config\") pod 
\"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.873574 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-config\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.873620 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ea044e-327f-4510-ae22-a6e7d61a6873-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.881843 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-proxy-ca-bundles\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.881939 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15e4920-ccda-4486-84ea-f48a51517d73-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.882539 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-client-ca\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.892900 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf043c0-bbd9-4411-a187-872e252bb850-serving-cert\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: E0225 10:56:37.893204 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.393187596 +0000 UTC m=+223.891769611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.893430 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.893488 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb7r6\" (UniqueName: \"kubernetes.io/projected/26ea044e-327f-4510-ae22-a6e7d61a6873-kube-api-access-xb7r6\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.962217 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.962498 4725 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26ea044e-327f-4510-ae22-a6e7d61a6873-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.962512 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55t2b\" (UniqueName: \"kubernetes.io/projected/b15e4920-ccda-4486-84ea-f48a51517d73-kube-api-access-55t2b\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.962531 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.962542 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b15e4920-ccda-4486-84ea-f48a51517d73-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.911737 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.911604 4725 generic.go:334] "Generic (PLEG): container finished" podID="26ea044e-327f-4510-ae22-a6e7d61a6873" containerID="6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a" exitCode=0 Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.911638 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" event={"ID":"26ea044e-327f-4510-ae22-a6e7d61a6873","Type":"ContainerDied","Data":"6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a"} Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.963144 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nrlgl" event={"ID":"26ea044e-327f-4510-ae22-a6e7d61a6873","Type":"ContainerDied","Data":"07fb12677a4eb51ca5f6dc3a21766d9babc58f68a8d20c668a2e5da09ae80765"} Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.963188 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjxjp"] Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.963218 4725 scope.go:117] "RemoveContainer" containerID="6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.971271 4725 generic.go:334] "Generic (PLEG): container finished" podID="b15e4920-ccda-4486-84ea-f48a51517d73" containerID="660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12" exitCode=0 Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.972896 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.975630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" event={"ID":"b15e4920-ccda-4486-84ea-f48a51517d73","Type":"ContainerDied","Data":"660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12"} Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.976210 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr" event={"ID":"b15e4920-ccda-4486-84ea-f48a51517d73","Type":"ContainerDied","Data":"73cb9bea73f809ee4bba4c5ff8c6432ed9cb08212c5f372b53e998861287d035"} Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.977982 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44556: no serving certificate available for the kubelet" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.978963 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9chb\" (UniqueName: \"kubernetes.io/projected/edf043c0-bbd9-4411-a187-872e252bb850-kube-api-access-b9chb\") pod \"controller-manager-d985b9fd6-2zlh6\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.979886 4725 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7624 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.979926 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.982538 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 10:56:37 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 25 10:56:37 crc kubenswrapper[4725]: [+]process-running ok Feb 25 10:56:37 crc kubenswrapper[4725]: healthz check failed Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.982574 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 10:56:37 crc kubenswrapper[4725]: I0225 10:56:37.987471 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.005648 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2tdp"] Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.019005 4725 scope.go:117] "RemoveContainer" containerID="6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a" Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.038776 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a\": container with ID starting with 6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a not found: ID does not exist" 
containerID="6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.038819 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a"} err="failed to get container status \"6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a\": rpc error: code = NotFound desc = could not find container \"6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a\": container with ID starting with 6dc89d2aca73b13565c15b4951ad83171fffda038527641bccc81b1b971f6c3a not found: ID does not exist" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.038863 4725 scope.go:117] "RemoveContainer" containerID="660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.073197 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"] Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.077882 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-77cqr"] Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.084023 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nrlgl"] Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.086609 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nrlgl"] Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.088239 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.088596 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.090729 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.590709801 +0000 UTC m=+224.089291826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.091330 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.091677 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.591660315 +0000 UTC m=+224.090242410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.137776 4725 scope.go:117] "RemoveContainer" containerID="660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12" Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.142159 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12\": container with ID starting with 660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12 not found: ID does not exist" containerID="660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.142218 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12"} err="failed to get container status \"660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12\": rpc error: code = NotFound desc = could not find container \"660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12\": container with ID starting with 660bf336f0f2800f036bcfa33686df3ec68f7a90c78a786b01bad4d49530ff12 not found: ID does not exist" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 
10:56:38.194184 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.194570 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.694542309 +0000 UTC m=+224.193124334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.226842 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxhvz" Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.291698 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dcstn"] Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.296639 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.297084 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.797069414 +0000 UTC m=+224.295651439 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.366878 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bq27c"] Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.397409 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.397909 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:38.897890124 +0000 UTC m=+224.396472149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.500135 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.500592 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.000576923 +0000 UTC m=+224.499158948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.555289 4725 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.607837 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.608266 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.108234149 +0000 UTC m=+224.606816174 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.608352 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.608754 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.108746372 +0000 UTC m=+224.607328397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.649651 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44570: no serving certificate available for the kubelet"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.709668 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.709786 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.209766208 +0000 UTC m=+224.708348233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.709873 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.710478 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.210468706 +0000 UTC m=+224.709050731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.751518 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d985b9fd6-2zlh6"]
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.810982 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.811136 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.311112222 +0000 UTC m=+224.809694247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.811280 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.811588 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.311575804 +0000 UTC m=+224.810157829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.853803 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 25 10:56:38 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld
Feb 25 10:56:38 crc kubenswrapper[4725]: [+]process-running ok
Feb 25 10:56:38 crc kubenswrapper[4725]: healthz check failed
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.854212 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.855520 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6c8m5"]
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.856074 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15e4920-ccda-4486-84ea-f48a51517d73" containerName="route-controller-manager"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.856099 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15e4920-ccda-4486-84ea-f48a51517d73" containerName="route-controller-manager"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.856239 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15e4920-ccda-4486-84ea-f48a51517d73" containerName="route-controller-manager"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.857981 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.859879 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.872467 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c8m5"]
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.911882 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.912065 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkj27\" (UniqueName: \"kubernetes.io/projected/34091911-8e18-4a85-b0c2-a07e3c1a7e28-kube-api-access-wkj27\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.912109 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-utilities\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.912177 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-catalog-content\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:38 crc kubenswrapper[4725]: E0225 10:56:38.912280 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.412263921 +0000 UTC m=+224.910845936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.986264 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" event={"ID":"edf043c0-bbd9-4411-a187-872e252bb850","Type":"ContainerStarted","Data":"ace3400106919f4a5300b9895bdc361a296c70f374443473a79b0f23753bfa50"}
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.986364 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" event={"ID":"edf043c0-bbd9-4411-a187-872e252bb850","Type":"ContainerStarted","Data":"d2e257a106f00f95ae13d931857baf759428fb7263964fa82fb6d152d39d22d4"}
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.987709 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.989979 4725 generic.go:334] "Generic (PLEG): container finished" podID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerID="a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a" exitCode=0
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.990052 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dcstn" event={"ID":"47446d07-b5cf-4646-b54b-0e841fb3a662","Type":"ContainerDied","Data":"a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a"}
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.990075 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dcstn" event={"ID":"47446d07-b5cf-4646-b54b-0e841fb3a662","Type":"ContainerStarted","Data":"2b697a7684e1e9acaba2496de7c01c17d5bf55cf5c5dff2c730a875468ffa4e0"}
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.991803 4725 patch_prober.go:28] interesting pod/controller-manager-d985b9fd6-2zlh6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body=
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.991846 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" podUID="edf043c0-bbd9-4411-a187-872e252bb850" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused"
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.992505 4725 generic.go:334] "Generic (PLEG): container finished" podID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerID="42fe0e5d5dc915f4da6c822faad13fa6c4c2db0c74f8f862a8cc877c89ed20c6" exitCode=0
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.992546 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bq27c" event={"ID":"dec8f4b6-001e-4ce7-b6d4-55b197612a38","Type":"ContainerDied","Data":"42fe0e5d5dc915f4da6c822faad13fa6c4c2db0c74f8f862a8cc877c89ed20c6"}
Feb 25 10:56:38 crc kubenswrapper[4725]: I0225 10:56:38.992562 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bq27c" event={"ID":"dec8f4b6-001e-4ce7-b6d4-55b197612a38","Type":"ContainerStarted","Data":"1519413a92799634ca8955c05617980b1b05090db7d4a10e7ee7a9642b3f5b7f"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.002994 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tdp" event={"ID":"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d","Type":"ContainerDied","Data":"c6a64853a4a31dcea88de6c448cef25fe7aa5ab333229a9d1cce19e5f6b6f030"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.001936 4725 generic.go:334] "Generic (PLEG): container finished" podID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerID="c6a64853a4a31dcea88de6c448cef25fe7aa5ab333229a9d1cce19e5f6b6f030" exitCode=0
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.003310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tdp" event={"ID":"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d","Type":"ContainerStarted","Data":"f0175480ac8d60fb63694937c29067d5d646a5245d2f5865cdf1ea4cc93d60db"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.004654 4725 generic.go:334] "Generic (PLEG): container finished" podID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerID="92893cf08b177659fe7f7f5c0824254848767f650b0b5de5b00bf6bebadc7ef4" exitCode=0
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.004752 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjxjp" event={"ID":"8f0d98c3-7ffa-4029-ab5c-c252062b3099","Type":"ContainerDied","Data":"92893cf08b177659fe7f7f5c0824254848767f650b0b5de5b00bf6bebadc7ef4"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.004792 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjxjp" event={"ID":"8f0d98c3-7ffa-4029-ab5c-c252062b3099","Type":"ContainerStarted","Data":"085fe9f2cc9986df50f2b1b381ed651a74148caa28c3f7c0c1a9fffce075d7d6"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.015632 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-catalog-content\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.015735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkj27\" (UniqueName: \"kubernetes.io/projected/34091911-8e18-4a85-b0c2-a07e3c1a7e28-kube-api-access-wkj27\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.015976 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.016009 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-utilities\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.018219 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" event={"ID":"de7222c9-af96-4a59-9188-b53187f1cbe3","Type":"ContainerStarted","Data":"d8f4241fd4722390755190ed96c7b513ade7851bf47de75b1d3a4a66a53bf6fa"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.018252 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" event={"ID":"de7222c9-af96-4a59-9188-b53187f1cbe3","Type":"ContainerStarted","Data":"178447d431340eed6f0d954fedd0fb651d71ffb1b46f47ec78e5b1a86f46a017"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.018262 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" event={"ID":"de7222c9-af96-4a59-9188-b53187f1cbe3","Type":"ContainerStarted","Data":"1f329d56980e45e77b3eca8b38e95b35b89508efed5b5dd280cb958240b23afb"}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.026853 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-catalog-content\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.027564 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-utilities\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:39 crc kubenswrapper[4725]: E0225 10:56:39.027966 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.527943994 +0000 UTC m=+225.026526019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.032699 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" podStartSLOduration=5.032680895 podStartE2EDuration="5.032680895s" podCreationTimestamp="2026-02-25 10:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:39.010364352 +0000 UTC m=+224.508946387" watchObservedRunningTime="2026-02-25 10:56:39.032680895 +0000 UTC m=+224.531262920"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.053322 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkj27\" (UniqueName: \"kubernetes.io/projected/34091911-8e18-4a85-b0c2-a07e3c1a7e28-kube-api-access-wkj27\") pod \"redhat-marketplace-6c8m5\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.104223 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ql8k8" podStartSLOduration=11.104203103 podStartE2EDuration="11.104203103s" podCreationTimestamp="2026-02-25 10:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:39.101850593 +0000 UTC m=+224.600432638" watchObservedRunningTime="2026-02-25 10:56:39.104203103 +0000 UTC m=+224.602785138"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.117028 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:39 crc kubenswrapper[4725]: E0225 10:56:39.118024 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.617997508 +0000 UTC m=+225.116579543 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.148630 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.163579 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.163711 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.168102 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.168307 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.201192 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c8m5"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.219955 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.220037 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.220146 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 25 10:56:39 crc kubenswrapper[4725]: E0225 10:56:39.220590 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.720572523 +0000 UTC m=+225.219154548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.256269 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ea044e-327f-4510-ae22-a6e7d61a6873" path="/var/lib/kubelet/pods/26ea044e-327f-4510-ae22-a6e7d61a6873/volumes"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.256887 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15e4920-ccda-4486-84ea-f48a51517d73" path="/var/lib/kubelet/pods/b15e4920-ccda-4486-84ea-f48a51517d73/volumes"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.269247 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7mt"]
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.270524 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.295500 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7mt"]
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.323505 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.323730 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.323795 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-catalog-content\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.323847 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-utilities\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.323878 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.323944 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b99hf\" (UniqueName: \"kubernetes.io/projected/8817d816-5958-4498-8a0d-528952c47e3a-kube-api-access-b99hf\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: E0225 10:56:39.324101 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.824082823 +0000 UTC m=+225.322664848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.324422 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.365625 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.425935 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.425990 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-catalog-content\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.426036 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-utilities\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.426106 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b99hf\" (UniqueName: \"kubernetes.io/projected/8817d816-5958-4498-8a0d-528952c47e3a-kube-api-access-b99hf\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: E0225 10:56:39.426388 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 10:56:39.926366561 +0000 UTC m=+225.424948656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dpmr4" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.426520 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-catalog-content\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.426660 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-utilities\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.454418 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b99hf\" (UniqueName: \"kubernetes.io/projected/8817d816-5958-4498-8a0d-528952c47e3a-kube-api-access-b99hf\") pod \"redhat-marketplace-gx7mt\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") " pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.476139 4725 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-25T10:56:38.555586096Z","Handler":null,"Name":""}
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.486394 4725
csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.486436 4725 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.527070 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.536679 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.547622 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c8m5"] Feb 25 10:56:39 crc kubenswrapper[4725]: W0225 10:56:39.558025 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34091911_8e18_4a85_b0c2_a07e3c1a7e28.slice/crio-5b639b012f956421200bdca7ca123ac62175d18af5b775aadc6c6612cb4903b8 WatchSource:0}: Error finding container 5b639b012f956421200bdca7ca123ac62175d18af5b775aadc6c6612cb4903b8: Status 404 returned error can't find the container with id 5b639b012f956421200bdca7ca123ac62175d18af5b775aadc6c6612cb4903b8 Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.566100 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.602417 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7mt" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.632596 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.655786 4725 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.655843 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.722036 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dpmr4\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") " pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.857282 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.871642 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 10:56:39 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 25 10:56:39 crc kubenswrapper[4725]: [+]process-running ok Feb 25 10:56:39 crc kubenswrapper[4725]: healthz check failed Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.871694 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.885601 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t54lf"] Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.887335 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t54lf"] Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.887421 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.890554 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.958346 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk7r6\" (UniqueName: \"kubernetes.io/projected/85249796-156c-4e21-81ee-d4cca9c8a607-kube-api-access-sk7r6\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.958645 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-utilities\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:39 crc kubenswrapper[4725]: I0225 10:56:39.958673 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-catalog-content\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.024694 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44578: no serving certificate available for the kubelet" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.045260 4725 generic.go:334] "Generic (PLEG): container finished" podID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerID="f85db8f74d363c09e1852d5286b16203f0dd9993771eb2931945dad3ff8edd43" exitCode=0 Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.046785 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c8m5" event={"ID":"34091911-8e18-4a85-b0c2-a07e3c1a7e28","Type":"ContainerDied","Data":"f85db8f74d363c09e1852d5286b16203f0dd9993771eb2931945dad3ff8edd43"} Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.046991 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c8m5" event={"ID":"34091911-8e18-4a85-b0c2-a07e3c1a7e28","Type":"ContainerStarted","Data":"5b639b012f956421200bdca7ca123ac62175d18af5b775aadc6c6612cb4903b8"} Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.059691 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-utilities\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.059750 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-catalog-content\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.059913 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk7r6\" (UniqueName: \"kubernetes.io/projected/85249796-156c-4e21-81ee-d4cca9c8a607-kube-api-access-sk7r6\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.061086 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-utilities\") pod 
\"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.061370 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-catalog-content\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.073070 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.091389 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk7r6\" (UniqueName: \"kubernetes.io/projected/85249796-156c-4e21-81ee-d4cca9c8a607-kube-api-access-sk7r6\") pod \"redhat-operators-t54lf\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.261158 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn"] Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.261780 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.285443 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.285699 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.285817 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.285942 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.286226 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.286418 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.291928 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n87p9"] Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.293208 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.294322 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn"] Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.299319 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.307202 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n87p9"] Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.336652 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7mt"] Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.371292 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-459q4\" (UniqueName: \"kubernetes.io/projected/54900028-6d3f-4515-8262-f75588f98fb6-kube-api-access-459q4\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.371374 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-catalog-content\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.371418 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-utilities\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.371454 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcj2n\" (UniqueName: \"kubernetes.io/projected/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-kube-api-access-qcj2n\") pod 
\"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.371511 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54900028-6d3f-4515-8262-f75588f98fb6-serving-cert\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.371561 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-config\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.371580 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-client-ca\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.410002 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 25 10:56:40 crc kubenswrapper[4725]: W0225 10:56:40.423074 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod619d4f2a_3a7b_45ee_aa3b_e6106371889d.slice/crio-c267d0c4ff1679f5e4bc4b0a5cbccfd122e05b407479e30c94efb12fd5ce489e WatchSource:0}: Error finding container 
c267d0c4ff1679f5e4bc4b0a5cbccfd122e05b407479e30c94efb12fd5ce489e: Status 404 returned error can't find the container with id c267d0c4ff1679f5e4bc4b0a5cbccfd122e05b407479e30c94efb12fd5ce489e Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.473197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54900028-6d3f-4515-8262-f75588f98fb6-serving-cert\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.473444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-client-ca\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.473660 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-config\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.473959 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-459q4\" (UniqueName: \"kubernetes.io/projected/54900028-6d3f-4515-8262-f75588f98fb6-kube-api-access-459q4\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.474465 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-catalog-content\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.477222 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-utilities\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.477285 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcj2n\" (UniqueName: \"kubernetes.io/projected/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-kube-api-access-qcj2n\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.478461 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-client-ca\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.477186 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-config\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 
10:56:40.475693 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-catalog-content\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.478934 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-utilities\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.480600 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54900028-6d3f-4515-8262-f75588f98fb6-serving-cert\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.504989 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpmr4"] Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.520864 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-459q4\" (UniqueName: \"kubernetes.io/projected/54900028-6d3f-4515-8262-f75588f98fb6-kube-api-access-459q4\") pod \"route-controller-manager-589b8f796-p5zrn\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.529241 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcj2n\" (UniqueName: 
\"kubernetes.io/projected/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-kube-api-access-qcj2n\") pod \"redhat-operators-n87p9\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") " pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: W0225 10:56:40.540718 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d5eb9c_abf6_4d9c_ba1f_4a78324d5519.slice/crio-ae4fad6bd8a056d20271b3d52d84f9c0a55b466bc54caaa65f0302d7bfc3dbb9 WatchSource:0}: Error finding container ae4fad6bd8a056d20271b3d52d84f9c0a55b466bc54caaa65f0302d7bfc3dbb9: Status 404 returned error can't find the container with id ae4fad6bd8a056d20271b3d52d84f9c0a55b466bc54caaa65f0302d7bfc3dbb9 Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.581010 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.581650 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.595971 4725 patch_prober.go:28] interesting pod/console-f9d7485db-f4l29 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.596023 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-f4l29" podUID="dcf8d8d2-144e-4232-bd68-b14a9f178c7d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.640521 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.709428 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.852428 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 10:56:40 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 25 10:56:40 crc kubenswrapper[4725]: [+]process-running ok Feb 25 10:56:40 crc kubenswrapper[4725]: healthz check failed Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.852796 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 10:56:40 crc kubenswrapper[4725]: I0225 10:56:40.922022 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t54lf"] Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.064577 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"619d4f2a-3a7b-45ee-aa3b-e6106371889d","Type":"ContainerStarted","Data":"c267d0c4ff1679f5e4bc4b0a5cbccfd122e05b407479e30c94efb12fd5ce489e"} Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.072012 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" event={"ID":"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519","Type":"ContainerStarted","Data":"8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7"} Feb 25 10:56:41 crc 
kubenswrapper[4725]: I0225 10:56:41.072051 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" event={"ID":"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519","Type":"ContainerStarted","Data":"ae4fad6bd8a056d20271b3d52d84f9c0a55b466bc54caaa65f0302d7bfc3dbb9"} Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.072103 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.075104 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t54lf" event={"ID":"85249796-156c-4e21-81ee-d4cca9c8a607","Type":"ContainerStarted","Data":"391bc5dcdb87e8ef5d9001f5f2d5e6375d4a0c4565db01a82eb4a779e5407d9b"} Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.099042 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" podStartSLOduration=172.09902511 podStartE2EDuration="2m52.09902511s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:41.098965259 +0000 UTC m=+226.597547304" watchObservedRunningTime="2026-02-25 10:56:41.09902511 +0000 UTC m=+226.597607155" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.099155 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-9p4cm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.099484 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9p4cm" podUID="fb51f87b-5859-44b4-ae55-c4f11ed0237b" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.099344 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-9p4cm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.099994 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9p4cm" podUID="fb51f87b-5859-44b4-ae55-c4f11ed0237b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.133910 4725 generic.go:334] "Generic (PLEG): container finished" podID="8817d816-5958-4498-8a0d-528952c47e3a" containerID="c7f8d31c04d7b33f5fc901be77864ef89d7f7eb17274e82626f90ad69d6fce5e" exitCode=0 Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.134380 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7mt" event={"ID":"8817d816-5958-4498-8a0d-528952c47e3a","Type":"ContainerDied","Data":"c7f8d31c04d7b33f5fc901be77864ef89d7f7eb17274e82626f90ad69d6fce5e"} Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.134435 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7mt" event={"ID":"8817d816-5958-4498-8a0d-528952c47e3a","Type":"ContainerStarted","Data":"0bf8fb6d6399bf0bd81e2edd0e4c16c2d15baa064bd7336768739c3708859e01"} Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.238903 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 25 10:56:41 crc 
kubenswrapper[4725]: I0225 10:56:41.295145 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn"] Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.399621 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n87p9"] Feb 25 10:56:41 crc kubenswrapper[4725]: W0225 10:56:41.494139 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc934ca68_7c23_4a8f_8e09_8d3edad1e1a5.slice/crio-a640e3a88556dc6a44446a0281d8f17961ccf83cc5b57cd13c94fbd72e70f8a9 WatchSource:0}: Error finding container a640e3a88556dc6a44446a0281d8f17961ccf83cc5b57cd13c94fbd72e70f8a9: Status 404 returned error can't find the container with id a640e3a88556dc6a44446a0281d8f17961ccf83cc5b57cd13c94fbd72e70f8a9 Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.556872 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.556982 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.771916 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.847839 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.858133 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 10:56:41 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 25 10:56:41 crc kubenswrapper[4725]: [+]process-running ok Feb 25 10:56:41 crc kubenswrapper[4725]: healthz check failed Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.858210 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 10:56:41 crc kubenswrapper[4725]: I0225 10:56:41.941209 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.031782 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.042599 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qsb7p" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.192803 4725 generic.go:334] "Generic (PLEG): container finished" podID="619d4f2a-3a7b-45ee-aa3b-e6106371889d" containerID="199f5e924fad8887b9c257fa0d8269cbef0df61f75b4172b64d5cf6ce50fc004" exitCode=0 Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.192896 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"619d4f2a-3a7b-45ee-aa3b-e6106371889d","Type":"ContainerDied","Data":"199f5e924fad8887b9c257fa0d8269cbef0df61f75b4172b64d5cf6ce50fc004"} Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.197028 4725 generic.go:334] "Generic (PLEG): container finished" podID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerID="e42f7faa476af5b1ba7fa3576b326f62838eefccb9017c95f10783b7498fbe06" exitCode=0 Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.197090 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n87p9" event={"ID":"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5","Type":"ContainerDied","Data":"e42f7faa476af5b1ba7fa3576b326f62838eefccb9017c95f10783b7498fbe06"} Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.197119 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n87p9" event={"ID":"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5","Type":"ContainerStarted","Data":"a640e3a88556dc6a44446a0281d8f17961ccf83cc5b57cd13c94fbd72e70f8a9"} Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.201241 4725 generic.go:334] "Generic (PLEG): container finished" podID="85249796-156c-4e21-81ee-d4cca9c8a607" containerID="049aa5c689e2c3873b2df13fea4bc907c104665f81898fe7524eb7a4757a5cdb" exitCode=0 Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.201313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t54lf" event={"ID":"85249796-156c-4e21-81ee-d4cca9c8a607","Type":"ContainerDied","Data":"049aa5c689e2c3873b2df13fea4bc907c104665f81898fe7524eb7a4757a5cdb"} Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.206450 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" event={"ID":"54900028-6d3f-4515-8262-f75588f98fb6","Type":"ContainerStarted","Data":"ac8c54ffae9254f3c4b9ece14cc1eb86726ce57e8108a6c1f4e2599dc9288e76"} Feb 25 10:56:42 crc 
kubenswrapper[4725]: I0225 10:56:42.206484 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.206493 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" event={"ID":"54900028-6d3f-4515-8262-f75588f98fb6","Type":"ContainerStarted","Data":"8a358d8a9ea1fd6c8989bb73479ca4c5dbdafb6da9cbe1e46d8f3d43512ff461"} Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.278311 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" podStartSLOduration=8.278290452 podStartE2EDuration="8.278290452s" podCreationTimestamp="2026-02-25 10:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:56:42.278139618 +0000 UTC m=+227.776721643" watchObservedRunningTime="2026-02-25 10:56:42.278290452 +0000 UTC m=+227.776872487" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.442691 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.443503 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.447966 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.448244 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.455852 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.625112 4725 ???:1] "http: TLS handshake error from 192.168.126.11:44590: no serving certificate available for the kubelet" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.646699 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26101d5a-aa07-43f8-b690-e56e64f69479-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.646815 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26101d5a-aa07-43f8-b690-e56e64f69479-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.650296 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.747344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/26101d5a-aa07-43f8-b690-e56e64f69479-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.747490 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26101d5a-aa07-43f8-b690-e56e64f69479-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.748001 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26101d5a-aa07-43f8-b690-e56e64f69479-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.772173 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26101d5a-aa07-43f8-b690-e56e64f69479-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.849755 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 10:56:42 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 25 10:56:42 crc kubenswrapper[4725]: [+]process-running ok Feb 25 10:56:42 crc kubenswrapper[4725]: healthz check failed Feb 25 10:56:42 crc kubenswrapper[4725]: I0225 10:56:42.849808 4725 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.066725 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.241285 4725 generic.go:334] "Generic (PLEG): container finished" podID="08fe5978-cb79-459f-b51a-b8f769ea177f" containerID="9efa1097b38368bb85aa4b081c9f8cb61478e622441ab1602bfa0088065f26de" exitCode=0 Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.241680 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" event={"ID":"08fe5978-cb79-459f-b51a-b8f769ea177f","Type":"ContainerDied","Data":"9efa1097b38368bb85aa4b081c9f8cb61478e622441ab1602bfa0088065f26de"} Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.443097 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.633038 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.818497 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kubelet-dir\") pod \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.818849 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kube-api-access\") pod \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\" (UID: \"619d4f2a-3a7b-45ee-aa3b-e6106371889d\") " Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.819761 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "619d4f2a-3a7b-45ee-aa3b-e6106371889d" (UID: "619d4f2a-3a7b-45ee-aa3b-e6106371889d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.827047 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "619d4f2a-3a7b-45ee-aa3b-e6106371889d" (UID: "619d4f2a-3a7b-45ee-aa3b-e6106371889d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.853238 4725 patch_prober.go:28] interesting pod/router-default-5444994796-7lb6x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 10:56:43 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 25 10:56:43 crc kubenswrapper[4725]: [+]process-running ok Feb 25 10:56:43 crc kubenswrapper[4725]: healthz check failed Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.853301 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7lb6x" podUID="6199f7d7-c530-47d4-8cb6-1526dcba2266" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.919607 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:43 crc kubenswrapper[4725]: I0225 10:56:43.919642 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/619d4f2a-3a7b-45ee-aa3b-e6106371889d-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.266428 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"619d4f2a-3a7b-45ee-aa3b-e6106371889d","Type":"ContainerDied","Data":"c267d0c4ff1679f5e4bc4b0a5cbccfd122e05b407479e30c94efb12fd5ce489e"} Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.266496 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c267d0c4ff1679f5e4bc4b0a5cbccfd122e05b407479e30c94efb12fd5ce489e" Feb 25 10:56:44 crc 
kubenswrapper[4725]: I0225 10:56:44.266584 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.278218 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"26101d5a-aa07-43f8-b690-e56e64f69479","Type":"ContainerStarted","Data":"be1c5119054045f50223e1af7bd965e3e365c60020a3d45f5a47408bf546480d"} Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.727230 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.745724 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08fe5978-cb79-459f-b51a-b8f769ea177f-config-volume\") pod \"08fe5978-cb79-459f-b51a-b8f769ea177f\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.745781 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7z8h\" (UniqueName: \"kubernetes.io/projected/08fe5978-cb79-459f-b51a-b8f769ea177f-kube-api-access-t7z8h\") pod \"08fe5978-cb79-459f-b51a-b8f769ea177f\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.745887 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08fe5978-cb79-459f-b51a-b8f769ea177f-secret-volume\") pod \"08fe5978-cb79-459f-b51a-b8f769ea177f\" (UID: \"08fe5978-cb79-459f-b51a-b8f769ea177f\") " Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.747923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/08fe5978-cb79-459f-b51a-b8f769ea177f-config-volume" (OuterVolumeSpecName: "config-volume") pod "08fe5978-cb79-459f-b51a-b8f769ea177f" (UID: "08fe5978-cb79-459f-b51a-b8f769ea177f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.754706 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fe5978-cb79-459f-b51a-b8f769ea177f-kube-api-access-t7z8h" (OuterVolumeSpecName: "kube-api-access-t7z8h") pod "08fe5978-cb79-459f-b51a-b8f769ea177f" (UID: "08fe5978-cb79-459f-b51a-b8f769ea177f"). InnerVolumeSpecName "kube-api-access-t7z8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.783264 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08fe5978-cb79-459f-b51a-b8f769ea177f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "08fe5978-cb79-459f-b51a-b8f769ea177f" (UID: "08fe5978-cb79-459f-b51a-b8f769ea177f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.850659 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08fe5978-cb79-459f-b51a-b8f769ea177f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.850698 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7z8h\" (UniqueName: \"kubernetes.io/projected/08fe5978-cb79-459f-b51a-b8f769ea177f-kube-api-access-t7z8h\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.850713 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08fe5978-cb79-459f-b51a-b8f769ea177f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.855360 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:44 crc kubenswrapper[4725]: I0225 10:56:44.857597 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7lb6x" Feb 25 10:56:45 crc kubenswrapper[4725]: I0225 10:56:45.297630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" event={"ID":"08fe5978-cb79-459f-b51a-b8f769ea177f","Type":"ContainerDied","Data":"c2ab7b9ad8e452c921bc3ec6ba0a7db5c7b6a271d4a401ab7492d483593f3ede"} Feb 25 10:56:45 crc kubenswrapper[4725]: I0225 10:56:45.297667 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ab7b9ad8e452c921bc3ec6ba0a7db5c7b6a271d4a401ab7492d483593f3ede" Feb 25 10:56:45 crc kubenswrapper[4725]: I0225 10:56:45.297648 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l" Feb 25 10:56:45 crc kubenswrapper[4725]: I0225 10:56:45.303070 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"26101d5a-aa07-43f8-b690-e56e64f69479","Type":"ContainerStarted","Data":"f16d41f77a5242895ed9b190aa8e8eca0d6316accfb70ed4c85f53b2166838b4"} Feb 25 10:56:46 crc kubenswrapper[4725]: I0225 10:56:46.312075 4725 generic.go:334] "Generic (PLEG): container finished" podID="26101d5a-aa07-43f8-b690-e56e64f69479" containerID="f16d41f77a5242895ed9b190aa8e8eca0d6316accfb70ed4c85f53b2166838b4" exitCode=0 Feb 25 10:56:46 crc kubenswrapper[4725]: I0225 10:56:46.312336 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"26101d5a-aa07-43f8-b690-e56e64f69479","Type":"ContainerDied","Data":"f16d41f77a5242895ed9b190aa8e8eca0d6316accfb70ed4c85f53b2166838b4"} Feb 25 10:56:46 crc kubenswrapper[4725]: I0225 10:56:46.568020 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zszmh" Feb 25 10:56:46 crc kubenswrapper[4725]: I0225 10:56:46.573156 4725 ???:1] "http: TLS handshake error from 192.168.126.11:58882: no serving certificate available for the kubelet" Feb 25 10:56:47 crc kubenswrapper[4725]: I0225 10:56:47.771395 4725 ???:1] "http: TLS handshake error from 192.168.126.11:58884: no serving certificate available for the kubelet" Feb 25 10:56:50 crc kubenswrapper[4725]: I0225 10:56:50.703675 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:50 crc kubenswrapper[4725]: I0225 10:56:50.707989 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-f4l29" Feb 25 10:56:51 crc kubenswrapper[4725]: I0225 10:56:51.111926 4725 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9p4cm" Feb 25 10:56:54 crc kubenswrapper[4725]: I0225 10:56:54.005341 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d985b9fd6-2zlh6"] Feb 25 10:56:54 crc kubenswrapper[4725]: I0225 10:56:54.006003 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" podUID="edf043c0-bbd9-4411-a187-872e252bb850" containerName="controller-manager" containerID="cri-o://ace3400106919f4a5300b9895bdc361a296c70f374443473a79b0f23753bfa50" gracePeriod=30 Feb 25 10:56:54 crc kubenswrapper[4725]: I0225 10:56:54.008679 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn"] Feb 25 10:56:54 crc kubenswrapper[4725]: I0225 10:56:54.008856 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" podUID="54900028-6d3f-4515-8262-f75588f98fb6" containerName="route-controller-manager" containerID="cri-o://ac8c54ffae9254f3c4b9ece14cc1eb86726ce57e8108a6c1f4e2599dc9288e76" gracePeriod=30 Feb 25 10:56:55 crc kubenswrapper[4725]: I0225 10:56:55.374621 4725 generic.go:334] "Generic (PLEG): container finished" podID="edf043c0-bbd9-4411-a187-872e252bb850" containerID="ace3400106919f4a5300b9895bdc361a296c70f374443473a79b0f23753bfa50" exitCode=0 Feb 25 10:56:55 crc kubenswrapper[4725]: I0225 10:56:55.374749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" event={"ID":"edf043c0-bbd9-4411-a187-872e252bb850","Type":"ContainerDied","Data":"ace3400106919f4a5300b9895bdc361a296c70f374443473a79b0f23753bfa50"} Feb 25 10:56:55 crc kubenswrapper[4725]: I0225 10:56:55.377800 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="54900028-6d3f-4515-8262-f75588f98fb6" containerID="ac8c54ffae9254f3c4b9ece14cc1eb86726ce57e8108a6c1f4e2599dc9288e76" exitCode=0 Feb 25 10:56:55 crc kubenswrapper[4725]: I0225 10:56:55.377911 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" event={"ID":"54900028-6d3f-4515-8262-f75588f98fb6","Type":"ContainerDied","Data":"ac8c54ffae9254f3c4b9ece14cc1eb86726ce57e8108a6c1f4e2599dc9288e76"} Feb 25 10:56:58 crc kubenswrapper[4725]: I0225 10:56:58.089286 4725 patch_prober.go:28] interesting pod/controller-manager-d985b9fd6-2zlh6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Feb 25 10:56:58 crc kubenswrapper[4725]: I0225 10:56:58.089548 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" podUID="edf043c0-bbd9-4411-a187-872e252bb850" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Feb 25 10:56:59 crc kubenswrapper[4725]: I0225 10:56:59.867785 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" Feb 25 10:57:00 crc kubenswrapper[4725]: I0225 10:57:00.642432 4725 patch_prober.go:28] interesting pod/route-controller-manager-589b8f796-p5zrn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Feb 25 10:57:00 crc kubenswrapper[4725]: I0225 10:57:00.643002 4725 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" podUID="54900028-6d3f-4515-8262-f75588f98fb6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Feb 25 10:57:01 crc kubenswrapper[4725]: I0225 10:57:01.855048 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.023558 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26101d5a-aa07-43f8-b690-e56e64f69479-kubelet-dir\") pod \"26101d5a-aa07-43f8-b690-e56e64f69479\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.023634 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26101d5a-aa07-43f8-b690-e56e64f69479-kube-api-access\") pod \"26101d5a-aa07-43f8-b690-e56e64f69479\" (UID: \"26101d5a-aa07-43f8-b690-e56e64f69479\") " Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.023653 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26101d5a-aa07-43f8-b690-e56e64f69479-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "26101d5a-aa07-43f8-b690-e56e64f69479" (UID: "26101d5a-aa07-43f8-b690-e56e64f69479"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.024029 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26101d5a-aa07-43f8-b690-e56e64f69479-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.035234 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26101d5a-aa07-43f8-b690-e56e64f69479-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "26101d5a-aa07-43f8-b690-e56e64f69479" (UID: "26101d5a-aa07-43f8-b690-e56e64f69479"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.125726 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26101d5a-aa07-43f8-b690-e56e64f69479-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.420115 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"26101d5a-aa07-43f8-b690-e56e64f69479","Type":"ContainerDied","Data":"be1c5119054045f50223e1af7bd965e3e365c60020a3d45f5a47408bf546480d"} Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.420455 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be1c5119054045f50223e1af7bd965e3e365c60020a3d45f5a47408bf546480d" Feb 25 10:57:02 crc kubenswrapper[4725]: I0225 10:57:02.420161 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 10:57:04 crc kubenswrapper[4725]: E0225 10:57:04.049864 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 25 10:57:04 crc kubenswrapper[4725]: E0225 10:57:04.050323 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 10:57:04 crc kubenswrapper[4725]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 25 10:57:04 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ll8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29533616-zsh9g_openshift-infra(b0b17a01-64f4-4578-9e56-19825cfa713f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 25 10:57:04 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 10:57:04 crc kubenswrapper[4725]: E0225 10:57:04.051858 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29533616-zsh9g" podUID="b0b17a01-64f4-4578-9e56-19825cfa713f" Feb 25 10:57:04 crc kubenswrapper[4725]: E0225 10:57:04.433701 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29533616-zsh9g" podUID="b0b17a01-64f4-4578-9e56-19825cfa713f" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.698982 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.703889 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.730298 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-579444c495-dchx4"] Feb 25 10:57:06 crc kubenswrapper[4725]: E0225 10:57:06.730552 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54900028-6d3f-4515-8262-f75588f98fb6" containerName="route-controller-manager" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.730568 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="54900028-6d3f-4515-8262-f75588f98fb6" containerName="route-controller-manager" Feb 25 10:57:06 crc kubenswrapper[4725]: E0225 10:57:06.730580 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf043c0-bbd9-4411-a187-872e252bb850" containerName="controller-manager" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.730588 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="edf043c0-bbd9-4411-a187-872e252bb850" containerName="controller-manager" Feb 25 10:57:06 crc kubenswrapper[4725]: E0225 10:57:06.730609 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fe5978-cb79-459f-b51a-b8f769ea177f" containerName="collect-profiles" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.730618 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fe5978-cb79-459f-b51a-b8f769ea177f" containerName="collect-profiles" Feb 25 10:57:06 crc kubenswrapper[4725]: E0225 10:57:06.730633 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619d4f2a-3a7b-45ee-aa3b-e6106371889d" containerName="pruner" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.730641 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="619d4f2a-3a7b-45ee-aa3b-e6106371889d" containerName="pruner" Feb 25 10:57:06 crc kubenswrapper[4725]: E0225 10:57:06.730649 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26101d5a-aa07-43f8-b690-e56e64f69479" containerName="pruner" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.730657 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="26101d5a-aa07-43f8-b690-e56e64f69479" containerName="pruner" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.746072 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="26101d5a-aa07-43f8-b690-e56e64f69479" containerName="pruner" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.746105 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="619d4f2a-3a7b-45ee-aa3b-e6106371889d" containerName="pruner" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.746118 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fe5978-cb79-459f-b51a-b8f769ea177f" containerName="collect-profiles" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.746127 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="54900028-6d3f-4515-8262-f75588f98fb6" containerName="route-controller-manager" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.746182 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf043c0-bbd9-4411-a187-872e252bb850" containerName="controller-manager" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.748064 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-579444c495-dchx4"] Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.748192 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880229 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-459q4\" (UniqueName: \"kubernetes.io/projected/54900028-6d3f-4515-8262-f75588f98fb6-kube-api-access-459q4\") pod \"54900028-6d3f-4515-8262-f75588f98fb6\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880290 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf043c0-bbd9-4411-a187-872e252bb850-serving-cert\") pod \"edf043c0-bbd9-4411-a187-872e252bb850\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-config\") pod \"54900028-6d3f-4515-8262-f75588f98fb6\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880405 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-config\") pod 
\"edf043c0-bbd9-4411-a187-872e252bb850\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880443 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54900028-6d3f-4515-8262-f75588f98fb6-serving-cert\") pod \"54900028-6d3f-4515-8262-f75588f98fb6\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880464 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-client-ca\") pod \"edf043c0-bbd9-4411-a187-872e252bb850\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880494 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-client-ca\") pod \"54900028-6d3f-4515-8262-f75588f98fb6\" (UID: \"54900028-6d3f-4515-8262-f75588f98fb6\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880528 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-proxy-ca-bundles\") pod \"edf043c0-bbd9-4411-a187-872e252bb850\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880570 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9chb\" (UniqueName: \"kubernetes.io/projected/edf043c0-bbd9-4411-a187-872e252bb850-kube-api-access-b9chb\") pod \"edf043c0-bbd9-4411-a187-872e252bb850\" (UID: \"edf043c0-bbd9-4411-a187-872e252bb850\") " Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880877 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-proxy-ca-bundles\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880926 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zc2b\" (UniqueName: \"kubernetes.io/projected/b5826295-3c46-4da4-9554-bd678e529c7b-kube-api-access-7zc2b\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880950 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5826295-3c46-4da4-9554-bd678e529c7b-serving-cert\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880975 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-config\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.880998 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-client-ca\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " 
pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.881923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-client-ca" (OuterVolumeSpecName: "client-ca") pod "54900028-6d3f-4515-8262-f75588f98fb6" (UID: "54900028-6d3f-4515-8262-f75588f98fb6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.881984 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-client-ca" (OuterVolumeSpecName: "client-ca") pod "edf043c0-bbd9-4411-a187-872e252bb850" (UID: "edf043c0-bbd9-4411-a187-872e252bb850"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.882052 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "edf043c0-bbd9-4411-a187-872e252bb850" (UID: "edf043c0-bbd9-4411-a187-872e252bb850"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.882647 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-config" (OuterVolumeSpecName: "config") pod "edf043c0-bbd9-4411-a187-872e252bb850" (UID: "edf043c0-bbd9-4411-a187-872e252bb850"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.882874 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-config" (OuterVolumeSpecName: "config") pod "54900028-6d3f-4515-8262-f75588f98fb6" (UID: "54900028-6d3f-4515-8262-f75588f98fb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.885939 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf043c0-bbd9-4411-a187-872e252bb850-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "edf043c0-bbd9-4411-a187-872e252bb850" (UID: "edf043c0-bbd9-4411-a187-872e252bb850"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.886462 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf043c0-bbd9-4411-a187-872e252bb850-kube-api-access-b9chb" (OuterVolumeSpecName: "kube-api-access-b9chb") pod "edf043c0-bbd9-4411-a187-872e252bb850" (UID: "edf043c0-bbd9-4411-a187-872e252bb850"). InnerVolumeSpecName "kube-api-access-b9chb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.886577 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54900028-6d3f-4515-8262-f75588f98fb6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "54900028-6d3f-4515-8262-f75588f98fb6" (UID: "54900028-6d3f-4515-8262-f75588f98fb6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.894938 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54900028-6d3f-4515-8262-f75588f98fb6-kube-api-access-459q4" (OuterVolumeSpecName: "kube-api-access-459q4") pod "54900028-6d3f-4515-8262-f75588f98fb6" (UID: "54900028-6d3f-4515-8262-f75588f98fb6"). InnerVolumeSpecName "kube-api-access-459q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.981793 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-proxy-ca-bundles\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.981938 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zc2b\" (UniqueName: \"kubernetes.io/projected/b5826295-3c46-4da4-9554-bd678e529c7b-kube-api-access-7zc2b\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5826295-3c46-4da4-9554-bd678e529c7b-serving-cert\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982578 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-config\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982617 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-client-ca\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982792 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982815 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982866 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54900028-6d3f-4515-8262-f75588f98fb6-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982879 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982890 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54900028-6d3f-4515-8262-f75588f98fb6-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982901 4725 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf043c0-bbd9-4411-a187-872e252bb850-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982915 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9chb\" (UniqueName: \"kubernetes.io/projected/edf043c0-bbd9-4411-a187-872e252bb850-kube-api-access-b9chb\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982928 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-459q4\" (UniqueName: \"kubernetes.io/projected/54900028-6d3f-4515-8262-f75588f98fb6-kube-api-access-459q4\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.982939 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf043c0-bbd9-4411-a187-872e252bb850-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.983023 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-proxy-ca-bundles\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.983402 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-config\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.983434 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-client-ca\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.985591 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5826295-3c46-4da4-9554-bd678e529c7b-serving-cert\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:06 crc kubenswrapper[4725]: I0225 10:57:06.996324 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zc2b\" (UniqueName: \"kubernetes.io/projected/b5826295-3c46-4da4-9554-bd678e529c7b-kube-api-access-7zc2b\") pod \"controller-manager-579444c495-dchx4\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") " pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.068618 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.451042 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" event={"ID":"edf043c0-bbd9-4411-a187-872e252bb850","Type":"ContainerDied","Data":"d2e257a106f00f95ae13d931857baf759428fb7263964fa82fb6d152d39d22d4"} Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.451082 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d985b9fd6-2zlh6" Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.451109 4725 scope.go:117] "RemoveContainer" containerID="ace3400106919f4a5300b9895bdc361a296c70f374443473a79b0f23753bfa50" Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.453715 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" event={"ID":"54900028-6d3f-4515-8262-f75588f98fb6","Type":"ContainerDied","Data":"8a358d8a9ea1fd6c8989bb73479ca4c5dbdafb6da9cbe1e46d8f3d43512ff461"} Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.453775 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn" Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.483580 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d985b9fd6-2zlh6"] Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.486699 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d985b9fd6-2zlh6"] Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.489117 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn"] Feb 25 10:57:07 crc kubenswrapper[4725]: I0225 10:57:07.491809 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b8f796-p5zrn"] Feb 25 10:57:08 crc kubenswrapper[4725]: I0225 10:57:08.270507 4725 ???:1] "http: TLS handshake error from 192.168.126.11:37054: no serving certificate available for the kubelet" Feb 25 10:57:09 crc kubenswrapper[4725]: I0225 10:57:09.239351 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54900028-6d3f-4515-8262-f75588f98fb6" 
path="/var/lib/kubelet/pods/54900028-6d3f-4515-8262-f75588f98fb6/volumes" Feb 25 10:57:09 crc kubenswrapper[4725]: I0225 10:57:09.240196 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edf043c0-bbd9-4411-a187-872e252bb850" path="/var/lib/kubelet/pods/edf043c0-bbd9-4411-a187-872e252bb850/volumes" Feb 25 10:57:10 crc kubenswrapper[4725]: E0225 10:57:10.092467 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 25 10:57:10 crc kubenswrapper[4725]: E0225 10:57:10.092647 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qcj2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:
nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n87p9_openshift-marketplace(c934ca68-7c23-4a8f-8e09-8d3edad1e1a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 10:57:10 crc kubenswrapper[4725]: E0225 10:57:10.093891 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n87p9" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" Feb 25 10:57:11 crc kubenswrapper[4725]: E0225 10:57:11.150984 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n87p9" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" Feb 25 10:57:11 crc kubenswrapper[4725]: E0225 10:57:11.203589 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 25 10:57:11 crc kubenswrapper[4725]: E0225 10:57:11.203752 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkj27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6c8m5_openshift-marketplace(34091911-8e18-4a85-b0c2-a07e3c1a7e28): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 10:57:11 crc kubenswrapper[4725]: E0225 10:57:11.205205 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6c8m5" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" Feb 25 10:57:11 crc 
kubenswrapper[4725]: I0225 10:57:11.283371 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"]
Feb 25 10:57:11 crc kubenswrapper[4725]: E0225 10:57:11.284432 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 25 10:57:11 crc kubenswrapper[4725]: E0225 10:57:11.284599 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b99hf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gx7mt_openshift-marketplace(8817d816-5958-4498-8a0d-528952c47e3a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 25 10:57:11 crc kubenswrapper[4725]: E0225 10:57:11.285803 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gx7mt" podUID="8817d816-5958-4498-8a0d-528952c47e3a"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.288940 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.292362 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.292930 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.293345 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.293564 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.293758 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.293977 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.307575 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"]
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.348954 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257gp\" (UniqueName: \"kubernetes.io/projected/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-kube-api-access-257gp\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.349057 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-serving-cert\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.349121 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-config\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.349182 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-client-ca\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.450000 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-config\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.450090 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-client-ca\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.450142 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257gp\" (UniqueName: \"kubernetes.io/projected/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-kube-api-access-257gp\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.450178 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-serving-cert\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.450952 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-client-ca\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.451684 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-config\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.465157 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-serving-cert\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.480705 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257gp\" (UniqueName: \"kubernetes.io/projected/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-kube-api-access-257gp\") pod \"route-controller-manager-864dd6c844-gpmxn\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") " pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.530700 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nxmh5"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.561906 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.561968 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 10:57:11 crc kubenswrapper[4725]: I0225 10:57:11.609056 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:12 crc kubenswrapper[4725]: E0225 10:57:12.783778 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gx7mt" podUID="8817d816-5958-4498-8a0d-528952c47e3a"
Feb 25 10:57:12 crc kubenswrapper[4725]: E0225 10:57:12.784070 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6c8m5" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28"
Feb 25 10:57:12 crc kubenswrapper[4725]: E0225 10:57:12.865910 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 25 10:57:12 crc kubenswrapper[4725]: E0225 10:57:12.866075 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gl4qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qjxjp_openshift-marketplace(8f0d98c3-7ffa-4029-ab5c-c252062b3099): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 25 10:57:12 crc kubenswrapper[4725]: E0225 10:57:12.867538 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qjxjp" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099"
Feb 25 10:57:13 crc kubenswrapper[4725]: I0225 10:57:13.960473 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-579444c495-dchx4"]
Feb 25 10:57:14 crc kubenswrapper[4725]: I0225 10:57:14.063892 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"]
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.428727 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qjxjp" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099"
Feb 25 10:57:14 crc kubenswrapper[4725]: I0225 10:57:14.465117 4725 scope.go:117] "RemoveContainer" containerID="ac8c54ffae9254f3c4b9ece14cc1eb86726ce57e8108a6c1f4e2599dc9288e76"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.502576 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.503100 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7zw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l2tdp_openshift-marketplace(d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.504359 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l2tdp" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.525987 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.526145 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vgnwr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bq27c_openshift-marketplace(dec8f4b6-001e-4ce7-b6d4-55b197612a38): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.530282 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bq27c" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.546442 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.546591 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxw9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dcstn_openshift-marketplace(47446d07-b5cf-4646-b54b-0e841fb3a662): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 25 10:57:14 crc kubenswrapper[4725]: E0225 10:57:14.548801 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dcstn" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662"
Feb 25 10:57:14 crc kubenswrapper[4725]: I0225 10:57:14.785372 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-579444c495-dchx4"]
Feb 25 10:57:14 crc kubenswrapper[4725]: W0225 10:57:14.796210 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5826295_3c46_4da4_9554_bd678e529c7b.slice/crio-ece4c82c4a59eb0e8f9524642ac4153ecb02ac993053486ff67fc242bc9abd24 WatchSource:0}: Error finding container ece4c82c4a59eb0e8f9524642ac4153ecb02ac993053486ff67fc242bc9abd24: Status 404 returned error can't find the container with id ece4c82c4a59eb0e8f9524642ac4153ecb02ac993053486ff67fc242bc9abd24
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.034563 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"]
Feb 25 10:57:15 crc kubenswrapper[4725]: W0225 10:57:15.039701 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6e73c6_a672_44c2_916e_ab0ecb8256ed.slice/crio-2dd08456e5737d705c3f0f5144db1547bb0f8fcae8d82e699dbfbf80ddc7bd22 WatchSource:0}: Error finding container 2dd08456e5737d705c3f0f5144db1547bb0f8fcae8d82e699dbfbf80ddc7bd22: Status 404 returned error can't find the container with id 2dd08456e5737d705c3f0f5144db1547bb0f8fcae8d82e699dbfbf80ddc7bd22
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.495466 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" event={"ID":"b5826295-3c46-4da4-9554-bd678e529c7b","Type":"ContainerStarted","Data":"adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457"}
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.495773 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" event={"ID":"b5826295-3c46-4da4-9554-bd678e529c7b","Type":"ContainerStarted","Data":"ece4c82c4a59eb0e8f9524642ac4153ecb02ac993053486ff67fc242bc9abd24"}
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.495790 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-579444c495-dchx4"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.495929 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" podUID="b5826295-3c46-4da4-9554-bd678e529c7b" containerName="controller-manager" containerID="cri-o://adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457" gracePeriod=30
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.498472 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" event={"ID":"0f6e73c6-a672-44c2-916e-ab0ecb8256ed","Type":"ContainerStarted","Data":"954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca"}
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.498524 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" event={"ID":"0f6e73c6-a672-44c2-916e-ab0ecb8256ed","Type":"ContainerStarted","Data":"2dd08456e5737d705c3f0f5144db1547bb0f8fcae8d82e699dbfbf80ddc7bd22"}
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.498624 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" podUID="0f6e73c6-a672-44c2-916e-ab0ecb8256ed" containerName="route-controller-manager" containerID="cri-o://954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca" gracePeriod=30
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.498849 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.502152 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-579444c495-dchx4"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.502565 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t54lf" event={"ID":"85249796-156c-4e21-81ee-d4cca9c8a607","Type":"ContainerStarted","Data":"b6669f7a05a7046086fc2f480ed4d3967a8ab3b212433ec6d937674d0250a200"}
Feb 25 10:57:15 crc kubenswrapper[4725]: E0225 10:57:15.508359 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bq27c" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38"
Feb 25 10:57:15 crc kubenswrapper[4725]: E0225 10:57:15.508467 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dcstn" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662"
Feb 25 10:57:15 crc kubenswrapper[4725]: E0225 10:57:15.508544 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l2tdp" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.515139 4725 patch_prober.go:28] interesting pod/route-controller-manager-864dd6c844-gpmxn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:43884->10.217.0.58:8443: read: connection reset by peer" start-of-body=
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.515191 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" podUID="0f6e73c6-a672-44c2-916e-ab0ecb8256ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:43884->10.217.0.58:8443: read: connection reset by peer"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.528435 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" podStartSLOduration=21.528418095 podStartE2EDuration="21.528418095s" podCreationTimestamp="2026-02-25 10:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:15.527674294 +0000 UTC m=+261.026256339" watchObservedRunningTime="2026-02-25 10:57:15.528418095 +0000 UTC m=+261.027000120"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.528941 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" podStartSLOduration=22.52893237 podStartE2EDuration="22.52893237s" podCreationTimestamp="2026-02-25 10:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:15.513332448 +0000 UTC m=+261.011914483" watchObservedRunningTime="2026-02-25 10:57:15.52893237 +0000 UTC m=+261.027514395"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.838212 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.838998 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.843036 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.843371 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.856810 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.875086 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-864dd6c844-gpmxn_0f6e73c6-a672-44c2-916e-ab0ecb8256ed/route-controller-manager/0.log"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.875149 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.897245 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-579444c495-dchx4"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.900092 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"]
Feb 25 10:57:15 crc kubenswrapper[4725]: E0225 10:57:15.900400 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6e73c6-a672-44c2-916e-ab0ecb8256ed" containerName="route-controller-manager"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.900420 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6e73c6-a672-44c2-916e-ab0ecb8256ed" containerName="route-controller-manager"
Feb 25 10:57:15 crc kubenswrapper[4725]: E0225 10:57:15.900432 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5826295-3c46-4da4-9554-bd678e529c7b" containerName="controller-manager"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.900440 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5826295-3c46-4da4-9554-bd678e529c7b" containerName="controller-manager"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.901625 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5826295-3c46-4da4-9554-bd678e529c7b" containerName="controller-manager"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.901662 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6e73c6-a672-44c2-916e-ab0ecb8256ed" containerName="route-controller-manager"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.902293 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"
Feb 25 10:57:15 crc kubenswrapper[4725]: I0225 10:57:15.914959 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"]
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020434 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-config\") pod \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020594 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-proxy-ca-bundles\") pod \"b5826295-3c46-4da4-9554-bd678e529c7b\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020661 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-client-ca\") pod \"b5826295-3c46-4da4-9554-bd678e529c7b\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020687 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-serving-cert\") pod \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020706 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-client-ca\") pod \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020784 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5826295-3c46-4da4-9554-bd678e529c7b-serving-cert\") pod \"b5826295-3c46-4da4-9554-bd678e529c7b\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020838 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zc2b\" (UniqueName: \"kubernetes.io/projected/b5826295-3c46-4da4-9554-bd678e529c7b-kube-api-access-7zc2b\") pod \"b5826295-3c46-4da4-9554-bd678e529c7b\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020865 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-config\") pod \"b5826295-3c46-4da4-9554-bd678e529c7b\" (UID: \"b5826295-3c46-4da4-9554-bd678e529c7b\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.020893 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-257gp\" (UniqueName: \"kubernetes.io/projected/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-kube-api-access-257gp\") pod \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\" (UID: \"0f6e73c6-a672-44c2-916e-ab0ecb8256ed\") "
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021084 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-client-ca\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021117 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f451d5-03ac-49ed-9af4-13006078d6db-serving-cert\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021142 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d055d9-6133-4f62-9627-32b6e79697ba-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021173 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpsr\" (UniqueName: \"kubernetes.io/projected/52f451d5-03ac-49ed-9af4-13006078d6db-kube-api-access-4qpsr\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021262 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-config\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"
Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021391 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d055d9-6133-4f62-9627-32b6e79697ba-kubelet-dir\") pod
\"revision-pruner-9-crc\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021448 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b5826295-3c46-4da4-9554-bd678e529c7b" (UID: "b5826295-3c46-4da4-9554-bd678e529c7b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021456 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5826295-3c46-4da4-9554-bd678e529c7b" (UID: "b5826295-3c46-4da4-9554-bd678e529c7b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021557 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021573 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021546 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f6e73c6-a672-44c2-916e-ab0ecb8256ed" (UID: "0f6e73c6-a672-44c2-916e-ab0ecb8256ed"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.021719 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-config" (OuterVolumeSpecName: "config") pod "0f6e73c6-a672-44c2-916e-ab0ecb8256ed" (UID: "0f6e73c6-a672-44c2-916e-ab0ecb8256ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.022045 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-config" (OuterVolumeSpecName: "config") pod "b5826295-3c46-4da4-9554-bd678e529c7b" (UID: "b5826295-3c46-4da4-9554-bd678e529c7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.026715 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f6e73c6-a672-44c2-916e-ab0ecb8256ed" (UID: "0f6e73c6-a672-44c2-916e-ab0ecb8256ed"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.026744 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5826295-3c46-4da4-9554-bd678e529c7b-kube-api-access-7zc2b" (OuterVolumeSpecName: "kube-api-access-7zc2b") pod "b5826295-3c46-4da4-9554-bd678e529c7b" (UID: "b5826295-3c46-4da4-9554-bd678e529c7b"). InnerVolumeSpecName "kube-api-access-7zc2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.026847 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5826295-3c46-4da4-9554-bd678e529c7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5826295-3c46-4da4-9554-bd678e529c7b" (UID: "b5826295-3c46-4da4-9554-bd678e529c7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.028932 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-kube-api-access-257gp" (OuterVolumeSpecName: "kube-api-access-257gp") pod "0f6e73c6-a672-44c2-916e-ab0ecb8256ed" (UID: "0f6e73c6-a672-44c2-916e-ab0ecb8256ed"). InnerVolumeSpecName "kube-api-access-257gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.121942 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-client-ca\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122006 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f451d5-03ac-49ed-9af4-13006078d6db-serving-cert\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122030 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/21d055d9-6133-4f62-9627-32b6e79697ba-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122060 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpsr\" (UniqueName: \"kubernetes.io/projected/52f451d5-03ac-49ed-9af4-13006078d6db-kube-api-access-4qpsr\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122086 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-config\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122132 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d055d9-6133-4f62-9627-32b6e79697ba-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122186 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5826295-3c46-4da4-9554-bd678e529c7b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122198 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zc2b\" (UniqueName: \"kubernetes.io/projected/b5826295-3c46-4da4-9554-bd678e529c7b-kube-api-access-7zc2b\") on node 
\"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122207 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5826295-3c46-4da4-9554-bd678e529c7b-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122215 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-257gp\" (UniqueName: \"kubernetes.io/projected/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-kube-api-access-257gp\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122224 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122234 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122241 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6e73c6-a672-44c2-916e-ab0ecb8256ed-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.122294 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d055d9-6133-4f62-9627-32b6e79697ba-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.123323 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-client-ca\") pod \"route-controller-manager-6748f969f4-m7rz7\" 
(UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.123346 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-config\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.131015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f451d5-03ac-49ed-9af4-13006078d6db-serving-cert\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.135339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d055d9-6133-4f62-9627-32b6e79697ba-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.139409 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qpsr\" (UniqueName: \"kubernetes.io/projected/52f451d5-03ac-49ed-9af4-13006078d6db-kube-api-access-4qpsr\") pod \"route-controller-manager-6748f969f4-m7rz7\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.193401 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.220233 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.371396 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.427329 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"] Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.512189 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-864dd6c844-gpmxn_0f6e73c6-a672-44c2-916e-ab0ecb8256ed/route-controller-manager/0.log" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.513777 4725 generic.go:334] "Generic (PLEG): container finished" podID="0f6e73c6-a672-44c2-916e-ab0ecb8256ed" containerID="954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca" exitCode=255 Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.513862 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" event={"ID":"0f6e73c6-a672-44c2-916e-ab0ecb8256ed","Type":"ContainerDied","Data":"954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca"} Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.513932 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" event={"ID":"0f6e73c6-a672-44c2-916e-ab0ecb8256ed","Type":"ContainerDied","Data":"2dd08456e5737d705c3f0f5144db1547bb0f8fcae8d82e699dbfbf80ddc7bd22"} Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.513874 4725 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.513956 4725 scope.go:117] "RemoveContainer" containerID="954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.524790 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" event={"ID":"52f451d5-03ac-49ed-9af4-13006078d6db","Type":"ContainerStarted","Data":"cc098d8da65c2deebba4af1074252cd4b6c878c36ba9a8302225bdeedb27be4d"} Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.541968 4725 generic.go:334] "Generic (PLEG): container finished" podID="85249796-156c-4e21-81ee-d4cca9c8a607" containerID="b6669f7a05a7046086fc2f480ed4d3967a8ab3b212433ec6d937674d0250a200" exitCode=0 Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.542250 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t54lf" event={"ID":"85249796-156c-4e21-81ee-d4cca9c8a607","Type":"ContainerDied","Data":"b6669f7a05a7046086fc2f480ed4d3967a8ab3b212433ec6d937674d0250a200"} Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.546200 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"21d055d9-6133-4f62-9627-32b6e79697ba","Type":"ContainerStarted","Data":"3d1155d3d221a9fa92a5598884634cc0c30063f6642864f2255d4f2e03e882ec"} Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.552902 4725 scope.go:117] "RemoveContainer" containerID="954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca" Feb 25 10:57:16 crc kubenswrapper[4725]: E0225 10:57:16.553267 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca\": container with ID starting with 954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca not found: ID does not exist" containerID="954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.553302 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca"} err="failed to get container status \"954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca\": rpc error: code = NotFound desc = could not find container \"954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca\": container with ID starting with 954b03dbddd705cf83bb46d23b4a243206b5bbb63623b7f9c8acd7321ae937ca not found: ID does not exist" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.556059 4725 generic.go:334] "Generic (PLEG): container finished" podID="b5826295-3c46-4da4-9554-bd678e529c7b" containerID="adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457" exitCode=0 Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.556095 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" event={"ID":"b5826295-3c46-4da4-9554-bd678e529c7b","Type":"ContainerDied","Data":"adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457"} Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.556119 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" event={"ID":"b5826295-3c46-4da4-9554-bd678e529c7b","Type":"ContainerDied","Data":"ece4c82c4a59eb0e8f9524642ac4153ecb02ac993053486ff67fc242bc9abd24"} Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.556133 4725 scope.go:117] "RemoveContainer" containerID="adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457" Feb 25 10:57:16 crc 
kubenswrapper[4725]: I0225 10:57:16.556233 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-579444c495-dchx4" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.584873 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"] Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.591188 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864dd6c844-gpmxn"] Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.596121 4725 scope.go:117] "RemoveContainer" containerID="adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.596217 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-579444c495-dchx4"] Feb 25 10:57:16 crc kubenswrapper[4725]: E0225 10:57:16.596745 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457\": container with ID starting with adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457 not found: ID does not exist" containerID="adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.596785 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457"} err="failed to get container status \"adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457\": rpc error: code = NotFound desc = could not find container \"adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457\": container with ID starting with adc3dcc23115c9fc761e6dd2ff1263dfbbc561ec11e398a451da1cafaada7457 
not found: ID does not exist" Feb 25 10:57:16 crc kubenswrapper[4725]: I0225 10:57:16.601269 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-579444c495-dchx4"] Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.207546 4725 csr.go:261] certificate signing request csr-l4fpn is approved, waiting to be issued Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.215489 4725 csr.go:257] certificate signing request csr-l4fpn is issued Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.230800 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6e73c6-a672-44c2-916e-ab0ecb8256ed" path="/var/lib/kubelet/pods/0f6e73c6-a672-44c2-916e-ab0ecb8256ed/volumes" Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.231698 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5826295-3c46-4da4-9554-bd678e529c7b" path="/var/lib/kubelet/pods/b5826295-3c46-4da4-9554-bd678e529c7b/volumes" Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.563985 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t54lf" event={"ID":"85249796-156c-4e21-81ee-d4cca9c8a607","Type":"ContainerStarted","Data":"ae07f7d3fb90d681f57053b60eab106d7556e64e586bba9421777633a3d0b0cc"} Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.565906 4725 generic.go:334] "Generic (PLEG): container finished" podID="21d055d9-6133-4f62-9627-32b6e79697ba" containerID="55aaf4e5c8012ca8d53de4a0b8f9f79e2689463e844123baf923b89a6ceecb63" exitCode=0 Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.566026 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"21d055d9-6133-4f62-9627-32b6e79697ba","Type":"ContainerDied","Data":"55aaf4e5c8012ca8d53de4a0b8f9f79e2689463e844123baf923b89a6ceecb63"} Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.570489 4725 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" event={"ID":"52f451d5-03ac-49ed-9af4-13006078d6db","Type":"ContainerStarted","Data":"2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0"} Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.570701 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.576559 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.578958 4725 generic.go:334] "Generic (PLEG): container finished" podID="b0b17a01-64f4-4578-9e56-19825cfa713f" containerID="c3291a321087990616222f8b5a2cd186793e4e37830a806da78e2f830bac57b7" exitCode=0 Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.579009 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533616-zsh9g" event={"ID":"b0b17a01-64f4-4578-9e56-19825cfa713f","Type":"ContainerDied","Data":"c3291a321087990616222f8b5a2cd186793e4e37830a806da78e2f830bac57b7"} Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.583563 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t54lf" podStartSLOduration=3.603282785 podStartE2EDuration="38.583549449s" podCreationTimestamp="2026-02-25 10:56:39 +0000 UTC" firstStartedPulling="2026-02-25 10:56:42.211366972 +0000 UTC m=+227.709948997" lastFinishedPulling="2026-02-25 10:57:17.191633636 +0000 UTC m=+262.690215661" observedRunningTime="2026-02-25 10:57:17.582525999 +0000 UTC m=+263.081108054" watchObservedRunningTime="2026-02-25 10:57:17.583549449 +0000 UTC m=+263.082131484" Feb 25 10:57:17 crc kubenswrapper[4725]: I0225 10:57:17.626169 4725 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" podStartSLOduration=3.626145104 podStartE2EDuration="3.626145104s" podCreationTimestamp="2026-02-25 10:57:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:17.606468904 +0000 UTC m=+263.105050959" watchObservedRunningTime="2026-02-25 10:57:17.626145104 +0000 UTC m=+263.124727129" Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.216820 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 08:27:30.714766334 +0000 UTC Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.216874 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7413h30m12.497894263s for next certificate rotation Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.285277 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6987695655-fxzbt"] Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.285918 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.287760 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.288134 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.288361 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.288775 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.288877 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.289192 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.295355 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.295405 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6987695655-fxzbt"]
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.454316 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-client-ca\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.454370 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-proxy-ca-bundles\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.454407 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-config\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.454447 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx7g9\" (UniqueName: \"kubernetes.io/projected/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-kube-api-access-hx7g9\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.454606 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-serving-cert\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.556143 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-serving-cert\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.556237 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-client-ca\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.556265 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-proxy-ca-bundles\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.556303 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-config\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.556396 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx7g9\" (UniqueName: \"kubernetes.io/projected/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-kube-api-access-hx7g9\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.557339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-proxy-ca-bundles\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.557620 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-config\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.558207 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-client-ca\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.563634 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-serving-cert\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.581872 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx7g9\" (UniqueName: \"kubernetes.io/projected/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-kube-api-access-hx7g9\") pod \"controller-manager-6987695655-fxzbt\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.599260 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.799056 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.829458 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533616-zsh9g"
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.961130 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ll8c\" (UniqueName: \"kubernetes.io/projected/b0b17a01-64f4-4578-9e56-19825cfa713f-kube-api-access-9ll8c\") pod \"b0b17a01-64f4-4578-9e56-19825cfa713f\" (UID: \"b0b17a01-64f4-4578-9e56-19825cfa713f\") "
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.961186 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d055d9-6133-4f62-9627-32b6e79697ba-kubelet-dir\") pod \"21d055d9-6133-4f62-9627-32b6e79697ba\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") "
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.961295 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d055d9-6133-4f62-9627-32b6e79697ba-kube-api-access\") pod \"21d055d9-6133-4f62-9627-32b6e79697ba\" (UID: \"21d055d9-6133-4f62-9627-32b6e79697ba\") "
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.961314 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21d055d9-6133-4f62-9627-32b6e79697ba-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "21d055d9-6133-4f62-9627-32b6e79697ba" (UID: "21d055d9-6133-4f62-9627-32b6e79697ba"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.961577 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d055d9-6133-4f62-9627-32b6e79697ba-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.966065 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d055d9-6133-4f62-9627-32b6e79697ba-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "21d055d9-6133-4f62-9627-32b6e79697ba" (UID: "21d055d9-6133-4f62-9627-32b6e79697ba"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:57:18 crc kubenswrapper[4725]: I0225 10:57:18.966164 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b17a01-64f4-4578-9e56-19825cfa713f-kube-api-access-9ll8c" (OuterVolumeSpecName: "kube-api-access-9ll8c") pod "b0b17a01-64f4-4578-9e56-19825cfa713f" (UID: "b0b17a01-64f4-4578-9e56-19825cfa713f"). InnerVolumeSpecName "kube-api-access-9ll8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.042669 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6987695655-fxzbt"]
Feb 25 10:57:19 crc kubenswrapper[4725]: W0225 10:57:19.048134 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa4f6a9_4b0f_47a8_b61f_2e8d98ce4705.slice/crio-18098106a67b14df08901b04fae05dba5b49ba40e960b11736cf734acd53eda5 WatchSource:0}: Error finding container 18098106a67b14df08901b04fae05dba5b49ba40e960b11736cf734acd53eda5: Status 404 returned error can't find the container with id 18098106a67b14df08901b04fae05dba5b49ba40e960b11736cf734acd53eda5
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.062465 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d055d9-6133-4f62-9627-32b6e79697ba-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.062529 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ll8c\" (UniqueName: \"kubernetes.io/projected/b0b17a01-64f4-4578-9e56-19825cfa713f-kube-api-access-9ll8c\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.218551 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-07 04:28:21.796335695 +0000 UTC
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.218805 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7577h31m2.577532757s for next certificate rotation
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.592120 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt" event={"ID":"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705","Type":"ContainerStarted","Data":"24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e"}
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.592167 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt" event={"ID":"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705","Type":"ContainerStarted","Data":"18098106a67b14df08901b04fae05dba5b49ba40e960b11736cf734acd53eda5"}
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.592353 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.594515 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533616-zsh9g"
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.594560 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533616-zsh9g" event={"ID":"b0b17a01-64f4-4578-9e56-19825cfa713f","Type":"ContainerDied","Data":"2d9b3fc9c0db4a55aff037595f1438561a1d93f5c574561047a2b34bc933b843"}
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.594605 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d9b3fc9c0db4a55aff037595f1438561a1d93f5c574561047a2b34bc933b843"
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.597554 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.598041 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"21d055d9-6133-4f62-9627-32b6e79697ba","Type":"ContainerDied","Data":"3d1155d3d221a9fa92a5598884634cc0c30063f6642864f2255d4f2e03e882ec"}
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.598068 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.598077 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d1155d3d221a9fa92a5598884634cc0c30063f6642864f2255d4f2e03e882ec"
Feb 25 10:57:19 crc kubenswrapper[4725]: I0225 10:57:19.617290 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt" podStartSLOduration=6.617271991 podStartE2EDuration="6.617271991s" podCreationTimestamp="2026-02-25 10:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:19.615714336 +0000 UTC m=+265.114296371" watchObservedRunningTime="2026-02-25 10:57:19.617271991 +0000 UTC m=+265.115854016"
Feb 25 10:57:20 crc kubenswrapper[4725]: I0225 10:57:20.301161 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t54lf"
Feb 25 10:57:20 crc kubenswrapper[4725]: I0225 10:57:20.301238 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t54lf"
Feb 25 10:57:21 crc kubenswrapper[4725]: I0225 10:57:21.668007 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t54lf" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="registry-server" probeResult="failure" output=<
Feb 25 10:57:21 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s
Feb 25 10:57:21 crc kubenswrapper[4725]: >
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.233096 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 25 10:57:23 crc kubenswrapper[4725]: E0225 10:57:23.233288 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d055d9-6133-4f62-9627-32b6e79697ba" containerName="pruner"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.233299 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d055d9-6133-4f62-9627-32b6e79697ba" containerName="pruner"
Feb 25 10:57:23 crc kubenswrapper[4725]: E0225 10:57:23.233319 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b17a01-64f4-4578-9e56-19825cfa713f" containerName="oc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.233324 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b17a01-64f4-4578-9e56-19825cfa713f" containerName="oc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.233419 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b17a01-64f4-4578-9e56-19825cfa713f" containerName="oc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.233428 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d055d9-6133-4f62-9627-32b6e79697ba" containerName="pruner"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.233781 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.236315 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.237985 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.248689 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.316329 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.316441 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-var-lock\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.316524 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kube-api-access\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.417333 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-var-lock\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.417459 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kube-api-access\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.417464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-var-lock\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.417493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.417587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.439470 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kube-api-access\") pod \"installer-9-crc\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.587005 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 25 10:57:23 crc kubenswrapper[4725]: I0225 10:57:23.978519 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 25 10:57:24 crc kubenswrapper[4725]: I0225 10:57:24.623601 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76bf95fe-88cd-4f68-a0d4-a5059c8b666a","Type":"ContainerStarted","Data":"3a6ae950ef7ebf2cc7120ed1c9de1df1af2bf779fa5e5b6f4ad8fd216cf5a4ba"}
Feb 25 10:57:24 crc kubenswrapper[4725]: I0225 10:57:24.623957 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76bf95fe-88cd-4f68-a0d4-a5059c8b666a","Type":"ContainerStarted","Data":"e4233e5fafca5a0c1a8cc99295315eedde85cfc51842b5302ef244ea761fe0b8"}
Feb 25 10:57:24 crc kubenswrapper[4725]: I0225 10:57:24.645366 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.645298866 podStartE2EDuration="1.645298866s" podCreationTimestamp="2026-02-25 10:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:24.637664365 +0000 UTC m=+270.136246390" watchObservedRunningTime="2026-02-25 10:57:24.645298866 +0000 UTC m=+270.143880891"
Feb 25 10:57:26 crc kubenswrapper[4725]: I0225 10:57:26.641999 4725 generic.go:334] "Generic (PLEG): container finished" podID="8817d816-5958-4498-8a0d-528952c47e3a" containerID="5e49d4fb2df3a0793878af6b7d965fb5ea571752b9e83e47de9d3418b700c559" exitCode=0
Feb 25 10:57:26 crc kubenswrapper[4725]: I0225 10:57:26.642156 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7mt" event={"ID":"8817d816-5958-4498-8a0d-528952c47e3a","Type":"ContainerDied","Data":"5e49d4fb2df3a0793878af6b7d965fb5ea571752b9e83e47de9d3418b700c559"}
Feb 25 10:57:26 crc kubenswrapper[4725]: I0225 10:57:26.647402 4725 generic.go:334] "Generic (PLEG): container finished" podID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerID="b820657fdd9e54d4ef0bdb02170f3e7fe45c81e5514022e13635bb86fb461be6" exitCode=0
Feb 25 10:57:26 crc kubenswrapper[4725]: I0225 10:57:26.647440 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c8m5" event={"ID":"34091911-8e18-4a85-b0c2-a07e3c1a7e28","Type":"ContainerDied","Data":"b820657fdd9e54d4ef0bdb02170f3e7fe45c81e5514022e13635bb86fb461be6"}
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.665965 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tdp" event={"ID":"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d","Type":"ContainerStarted","Data":"32a2ff4338245e34564a739b7d146279013b13f5fa8c486eeeaf31f86f4ef8cc"}
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.671421 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjxjp" event={"ID":"8f0d98c3-7ffa-4029-ab5c-c252062b3099","Type":"ContainerStarted","Data":"8e88f37ae88e7ecaf1325e36a47cb9c7db6b1b4d75901f1ba04e427c12f0b843"}
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.674265 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n87p9" event={"ID":"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5","Type":"ContainerStarted","Data":"af25536dc547144a10c7873e8381c55f1457684a63749a97a3083bc4226383e6"}
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.677201 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c8m5" event={"ID":"34091911-8e18-4a85-b0c2-a07e3c1a7e28","Type":"ContainerStarted","Data":"b6aa6d0cb717b5cee60b22bc2cb6728496633a69bbe01ea2a946d586385c6f2c"}
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.680526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7mt" event={"ID":"8817d816-5958-4498-8a0d-528952c47e3a","Type":"ContainerStarted","Data":"7c4eb0b109b1a105a572b1bd97ac4f58436abb803aaddef36e3d17c134f4ff7c"}
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.683111 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bq27c" event={"ID":"dec8f4b6-001e-4ce7-b6d4-55b197612a38","Type":"ContainerStarted","Data":"ba9566c85ccf85b3c88cb6213bad7ca072baa47e90d6bec4d0d19f114205da28"}
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.718650 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gx7mt" podStartSLOduration=2.623734507 podStartE2EDuration="50.718633184s" podCreationTimestamp="2026-02-25 10:56:39 +0000 UTC" firstStartedPulling="2026-02-25 10:56:41.136746469 +0000 UTC m=+226.635328484" lastFinishedPulling="2026-02-25 10:57:29.231645126 +0000 UTC m=+274.730227161" observedRunningTime="2026-02-25 10:57:29.714894046 +0000 UTC m=+275.213476061" watchObservedRunningTime="2026-02-25 10:57:29.718633184 +0000 UTC m=+275.217215209"
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.737436 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6c8m5" podStartSLOduration=2.67824739 podStartE2EDuration="51.737421089s" podCreationTimestamp="2026-02-25 10:56:38 +0000 UTC" firstStartedPulling="2026-02-25 10:56:40.049398029 +0000 UTC m=+225.547980054" lastFinishedPulling="2026-02-25 10:57:29.108571678 +0000 UTC m=+274.607153753" observedRunningTime="2026-02-25 10:57:29.733891196 +0000 UTC m=+275.232473221" watchObservedRunningTime="2026-02-25 10:57:29.737421089 +0000 UTC m=+275.236003114"
Feb 25 10:57:29 crc kubenswrapper[4725]: I0225 10:57:29.907097 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6trwd"]
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.346086 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t54lf"
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.381509 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t54lf"
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.689053 4725 generic.go:334] "Generic (PLEG): container finished" podID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerID="ba9566c85ccf85b3c88cb6213bad7ca072baa47e90d6bec4d0d19f114205da28" exitCode=0
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.689131 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bq27c" event={"ID":"dec8f4b6-001e-4ce7-b6d4-55b197612a38","Type":"ContainerDied","Data":"ba9566c85ccf85b3c88cb6213bad7ca072baa47e90d6bec4d0d19f114205da28"}
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.692423 4725 generic.go:334] "Generic (PLEG): container finished" podID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerID="32a2ff4338245e34564a739b7d146279013b13f5fa8c486eeeaf31f86f4ef8cc" exitCode=0
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.692509 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tdp" event={"ID":"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d","Type":"ContainerDied","Data":"32a2ff4338245e34564a739b7d146279013b13f5fa8c486eeeaf31f86f4ef8cc"}
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.694330 4725 generic.go:334] "Generic (PLEG): container finished" podID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerID="8e88f37ae88e7ecaf1325e36a47cb9c7db6b1b4d75901f1ba04e427c12f0b843" exitCode=0
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.694404 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjxjp" event={"ID":"8f0d98c3-7ffa-4029-ab5c-c252062b3099","Type":"ContainerDied","Data":"8e88f37ae88e7ecaf1325e36a47cb9c7db6b1b4d75901f1ba04e427c12f0b843"}
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.697459 4725 generic.go:334] "Generic (PLEG): container finished" podID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerID="af25536dc547144a10c7873e8381c55f1457684a63749a97a3083bc4226383e6" exitCode=0
Feb 25 10:57:30 crc kubenswrapper[4725]: I0225 10:57:30.697498 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n87p9" event={"ID":"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5","Type":"ContainerDied","Data":"af25536dc547144a10c7873e8381c55f1457684a63749a97a3083bc4226383e6"}
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.705048 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n87p9" event={"ID":"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5","Type":"ContainerStarted","Data":"e0df7d9b0e7655104cce99e2802ab775c0b774e714c76d212a7984bbe56d267e"}
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.707880 4725 generic.go:334] "Generic (PLEG): container finished" podID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerID="e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242" exitCode=0
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.707941 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dcstn" event={"ID":"47446d07-b5cf-4646-b54b-0e841fb3a662","Type":"ContainerDied","Data":"e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242"}
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.710279 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bq27c" event={"ID":"dec8f4b6-001e-4ce7-b6d4-55b197612a38","Type":"ContainerStarted","Data":"92d7a0707f23c25b59c2f86679f48462f3ec1c333b56b23830e7b275ede09595"}
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.714532 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tdp" event={"ID":"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d","Type":"ContainerStarted","Data":"b677fbff2f3f581bfa3558d28c0c83f69a3ec4bd8085b0eccda8263e35419e27"}
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.716591 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjxjp" event={"ID":"8f0d98c3-7ffa-4029-ab5c-c252062b3099","Type":"ContainerStarted","Data":"7fa51daade8fd55cabfe4a6034a853e9418f682d31aedcf824c57c0d43a1972e"}
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.730616 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n87p9" podStartSLOduration=2.783545657 podStartE2EDuration="51.730592597s" podCreationTimestamp="2026-02-25 10:56:40 +0000 UTC" firstStartedPulling="2026-02-25 10:56:42.200481562 +0000 UTC m=+227.699063587" lastFinishedPulling="2026-02-25 10:57:31.147528502 +0000 UTC m=+276.646110527" observedRunningTime="2026-02-25 10:57:31.72898102 +0000 UTC m=+277.227563055" watchObservedRunningTime="2026-02-25 10:57:31.730592597 +0000 UTC m=+277.229174622"
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.746312 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qjxjp" podStartSLOduration=3.521313798 podStartE2EDuration="55.746291242s" podCreationTimestamp="2026-02-25 10:56:36 +0000 UTC" firstStartedPulling="2026-02-25 10:56:39.006748549 +0000 UTC m=+224.505330574" lastFinishedPulling="2026-02-25 10:57:31.231725993 +0000 UTC m=+276.730308018" observedRunningTime="2026-02-25 10:57:31.743179362 +0000 UTC m=+277.241761407" watchObservedRunningTime="2026-02-25 10:57:31.746291242 +0000 UTC m=+277.244873287"
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.763333 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2tdp" podStartSLOduration=3.592660254 podStartE2EDuration="55.763310805s" podCreationTimestamp="2026-02-25 10:56:36 +0000 UTC" firstStartedPulling="2026-02-25 10:56:39.00443954 +0000 UTC m=+224.503021565" lastFinishedPulling="2026-02-25 10:57:31.175090091 +0000 UTC m=+276.673672116" observedRunningTime="2026-02-25 10:57:31.760954457 +0000 UTC m=+277.259536482" watchObservedRunningTime="2026-02-25 10:57:31.763310805 +0000 UTC m=+277.261892850"
Feb 25 10:57:31 crc kubenswrapper[4725]: I0225 10:57:31.784569 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bq27c" podStartSLOduration=2.699761633 podStartE2EDuration="54.784548891s" podCreationTimestamp="2026-02-25 10:56:37 +0000 UTC" firstStartedPulling="2026-02-25 10:56:38.994216457 +0000 UTC m=+224.492798482" lastFinishedPulling="2026-02-25 10:57:31.079003715 +0000 UTC m=+276.577585740" observedRunningTime="2026-02-25 10:57:31.781131482 +0000 UTC m=+277.279713517" watchObservedRunningTime="2026-02-25 10:57:31.784548891 +0000 UTC m=+277.283130916"
Feb 25 10:57:32 crc kubenswrapper[4725]: I0225 10:57:32.724205 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dcstn" event={"ID":"47446d07-b5cf-4646-b54b-0e841fb3a662","Type":"ContainerStarted","Data":"0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5"}
Feb 25 10:57:32 crc kubenswrapper[4725]: I0225 10:57:32.740422 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dcstn" podStartSLOduration=2.500831956 podStartE2EDuration="55.740406985s" podCreationTimestamp="2026-02-25 10:56:37 +0000 UTC" firstStartedPulling="2026-02-25 10:56:38.991176739 +0000 UTC m=+224.489758764" lastFinishedPulling="2026-02-25 10:57:32.230751768 +0000 UTC m=+277.729333793" observedRunningTime="2026-02-25 10:57:32.738577842 +0000 UTC m=+278.237159867" watchObservedRunningTime="2026-02-25 10:57:32.740406985 +0000 UTC m=+278.238989010"
Feb 25 10:57:33 crc kubenswrapper[4725]: I0225 10:57:33.970045 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6987695655-fxzbt"]
Feb 25 10:57:33 crc kubenswrapper[4725]: I0225 10:57:33.970270 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt" podUID="cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" containerName="controller-manager" containerID="cri-o://24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e" gracePeriod=30
Feb 25 10:57:33 crc kubenswrapper[4725]: I0225 10:57:33.992295 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"]
Feb 25 10:57:33 crc kubenswrapper[4725]: I0225 10:57:33.992566 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" podUID="52f451d5-03ac-49ed-9af4-13006078d6db" containerName="route-controller-manager" containerID="cri-o://2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0" gracePeriod=30
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.515008 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.567679 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt"
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694643 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qpsr\" (UniqueName: \"kubernetes.io/projected/52f451d5-03ac-49ed-9af4-13006078d6db-kube-api-access-4qpsr\") pod \"52f451d5-03ac-49ed-9af4-13006078d6db\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") "
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694692 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-client-ca\") pod \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") "
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694743 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f451d5-03ac-49ed-9af4-13006078d6db-serving-cert\") pod \"52f451d5-03ac-49ed-9af4-13006078d6db\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") "
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694769 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-config\") pod \"52f451d5-03ac-49ed-9af4-13006078d6db\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") "
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694808 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-serving-cert\") pod \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") "
Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694886 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"kube-api-access-hx7g9\" (UniqueName: \"kubernetes.io/projected/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-kube-api-access-hx7g9\") pod \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694918 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-client-ca\") pod \"52f451d5-03ac-49ed-9af4-13006078d6db\" (UID: \"52f451d5-03ac-49ed-9af4-13006078d6db\") " Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694936 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-proxy-ca-bundles\") pod \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.694962 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-config\") pod \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\" (UID: \"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705\") " Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.695510 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-client-ca" (OuterVolumeSpecName: "client-ca") pod "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" (UID: "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.695566 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-config" (OuterVolumeSpecName: "config") pod "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" (UID: "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.695880 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-client-ca" (OuterVolumeSpecName: "client-ca") pod "52f451d5-03ac-49ed-9af4-13006078d6db" (UID: "52f451d5-03ac-49ed-9af4-13006078d6db"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.696404 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" (UID: "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.696429 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-config" (OuterVolumeSpecName: "config") pod "52f451d5-03ac-49ed-9af4-13006078d6db" (UID: "52f451d5-03ac-49ed-9af4-13006078d6db"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.699710 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f451d5-03ac-49ed-9af4-13006078d6db-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52f451d5-03ac-49ed-9af4-13006078d6db" (UID: "52f451d5-03ac-49ed-9af4-13006078d6db"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.699847 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f451d5-03ac-49ed-9af4-13006078d6db-kube-api-access-4qpsr" (OuterVolumeSpecName: "kube-api-access-4qpsr") pod "52f451d5-03ac-49ed-9af4-13006078d6db" (UID: "52f451d5-03ac-49ed-9af4-13006078d6db"). InnerVolumeSpecName "kube-api-access-4qpsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.700140 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-kube-api-access-hx7g9" (OuterVolumeSpecName: "kube-api-access-hx7g9") pod "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" (UID: "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705"). InnerVolumeSpecName "kube-api-access-hx7g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.703793 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" (UID: "cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.734389 4725 generic.go:334] "Generic (PLEG): container finished" podID="52f451d5-03ac-49ed-9af4-13006078d6db" containerID="2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0" exitCode=0 Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.734450 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" event={"ID":"52f451d5-03ac-49ed-9af4-13006078d6db","Type":"ContainerDied","Data":"2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0"} Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.734459 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.734475 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7" event={"ID":"52f451d5-03ac-49ed-9af4-13006078d6db","Type":"ContainerDied","Data":"cc098d8da65c2deebba4af1074252cd4b6c878c36ba9a8302225bdeedb27be4d"} Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.734493 4725 scope.go:117] "RemoveContainer" containerID="2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.736364 4725 generic.go:334] "Generic (PLEG): container finished" podID="cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" containerID="24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e" exitCode=0 Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.736390 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.736400 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt" event={"ID":"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705","Type":"ContainerDied","Data":"24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e"} Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.736442 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6987695655-fxzbt" event={"ID":"cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705","Type":"ContainerDied","Data":"18098106a67b14df08901b04fae05dba5b49ba40e960b11736cf734acd53eda5"} Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.754293 4725 scope.go:117] "RemoveContainer" containerID="2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0" Feb 25 10:57:34 crc kubenswrapper[4725]: E0225 10:57:34.755645 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0\": container with ID starting with 2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0 not found: ID does not exist" containerID="2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.755679 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0"} err="failed to get container status \"2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0\": rpc error: code = NotFound desc = could not find container \"2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0\": container with ID starting with 2fb4c8dcc61337f7412f1f6611a0e994e2419ecce241f590380d86084774b2b0 not found: ID does 
not exist" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.755702 4725 scope.go:117] "RemoveContainer" containerID="24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.768337 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6987695655-fxzbt"] Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.771332 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6987695655-fxzbt"] Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.777449 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"] Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.780923 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6748f969f4-m7rz7"] Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.782170 4725 scope.go:117] "RemoveContainer" containerID="24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e" Feb 25 10:57:34 crc kubenswrapper[4725]: E0225 10:57:34.782565 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e\": container with ID starting with 24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e not found: ID does not exist" containerID="24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.782599 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e"} err="failed to get container status \"24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e\": rpc error: code = NotFound desc = could not 
find container \"24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e\": container with ID starting with 24d1b1b682ba655d97506dd89ffce81fecc575fc293a362d84e56f9b4c8ef14e not found: ID does not exist" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796588 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx7g9\" (UniqueName: \"kubernetes.io/projected/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-kube-api-access-hx7g9\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796619 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796630 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796643 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796656 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qpsr\" (UniqueName: \"kubernetes.io/projected/52f451d5-03ac-49ed-9af4-13006078d6db-kube-api-access-4qpsr\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796666 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796678 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/52f451d5-03ac-49ed-9af4-13006078d6db-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796691 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f451d5-03ac-49ed-9af4-13006078d6db-config\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:34 crc kubenswrapper[4725]: I0225 10:57:34.796701 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.233162 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f451d5-03ac-49ed-9af4-13006078d6db" path="/var/lib/kubelet/pods/52f451d5-03ac-49ed-9af4-13006078d6db/volumes" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.233820 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" path="/var/lib/kubelet/pods/cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705/volumes" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.507913 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-674577d55b-kf647"] Feb 25 10:57:35 crc kubenswrapper[4725]: E0225 10:57:35.508147 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" containerName="controller-manager" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.508160 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" containerName="controller-manager" Feb 25 10:57:35 crc kubenswrapper[4725]: E0225 10:57:35.508174 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f451d5-03ac-49ed-9af4-13006078d6db" containerName="route-controller-manager" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.508182 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="52f451d5-03ac-49ed-9af4-13006078d6db" containerName="route-controller-manager" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.508315 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa4f6a9-4b0f-47a8-b61f-2e8d98ce4705" containerName="controller-manager" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.508329 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f451d5-03ac-49ed-9af4-13006078d6db" containerName="route-controller-manager" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.508770 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.515123 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.515321 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.515694 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.515994 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.516151 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.516211 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.520274 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"] Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.522819 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.524461 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"] Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.524756 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.529817 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-674577d55b-kf647"] Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.530721 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.531119 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.531504 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.531701 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.531779 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.532309 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607492 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55167746-1927-4807-a57c-1ea8da26a883-serving-cert\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607584 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-config\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-config\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607679 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efa5876-c9a6-4573-abc5-2dfb33360523-serving-cert\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607722 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-b9brc\" (UniqueName: \"kubernetes.io/projected/0efa5876-c9a6-4573-abc5-2dfb33360523-kube-api-access-b9brc\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607803 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-client-ca\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607903 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-client-ca\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.607946 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-proxy-ca-bundles\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.608014 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t95pl\" (UniqueName: \"kubernetes.io/projected/55167746-1927-4807-a57c-1ea8da26a883-kube-api-access-t95pl\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: 
\"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709413 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-client-ca\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709512 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-client-ca\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-proxy-ca-bundles\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709616 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t95pl\" (UniqueName: \"kubernetes.io/projected/55167746-1927-4807-a57c-1ea8da26a883-kube-api-access-t95pl\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709675 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-config\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709706 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55167746-1927-4807-a57c-1ea8da26a883-serving-cert\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709787 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-config\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709886 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efa5876-c9a6-4573-abc5-2dfb33360523-serving-cert\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.709935 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9brc\" (UniqueName: \"kubernetes.io/projected/0efa5876-c9a6-4573-abc5-2dfb33360523-kube-api-access-b9brc\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc 
kubenswrapper[4725]: I0225 10:57:35.710531 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-client-ca\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.711806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-config\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.711818 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-client-ca\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.712137 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-proxy-ca-bundles\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.712919 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-config\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " 
pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.714998 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55167746-1927-4807-a57c-1ea8da26a883-serving-cert\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.715262 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efa5876-c9a6-4573-abc5-2dfb33360523-serving-cert\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.729720 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9brc\" (UniqueName: \"kubernetes.io/projected/0efa5876-c9a6-4573-abc5-2dfb33360523-kube-api-access-b9brc\") pod \"controller-manager-674577d55b-kf647\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") " pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.745492 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t95pl\" (UniqueName: \"kubernetes.io/projected/55167746-1927-4807-a57c-1ea8da26a883-kube-api-access-t95pl\") pod \"route-controller-manager-7d78c65587-k6gqf\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") " pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.838423 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:35 crc kubenswrapper[4725]: I0225 10:57:35.849309 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:36.266118 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"] Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:36.755667 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" event={"ID":"55167746-1927-4807-a57c-1ea8da26a883","Type":"ContainerStarted","Data":"8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2"} Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:36.755719 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" event={"ID":"55167746-1927-4807-a57c-1ea8da26a883","Type":"ContainerStarted","Data":"3bb08942e43406bef47643287ca5db58299bb0900c86b5234239dc1e882140b6"} Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:36.755925 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:36.774845 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" podStartSLOduration=2.774814512 podStartE2EDuration="2.774814512s" podCreationTimestamp="2026-02-25 10:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:36.773493334 +0000 UTC m=+282.272075369" watchObservedRunningTime="2026-02-25 
10:57:36.774814512 +0000 UTC m=+282.273396537" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:36.964190 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.047086 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qjxjp" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.047133 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjxjp" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.120751 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjxjp" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.299005 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.299253 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.353020 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.366017 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-674577d55b-kf647"] Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.448800 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.449103 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:57:37 crc 
kubenswrapper[4725]: I0225 10:57:37.491600 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.678497 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.678563 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.729740 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.760854 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" event={"ID":"0efa5876-c9a6-4573-abc5-2dfb33360523","Type":"ContainerStarted","Data":"e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05"} Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.760904 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" event={"ID":"0efa5876-c9a6-4573-abc5-2dfb33360523","Type":"ContainerStarted","Data":"6c663ccc238a7503092014ab7d00455abd574aec50a5eca780424607dfb22fb8"} Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.791371 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" podStartSLOduration=4.791355523 podStartE2EDuration="4.791355523s" podCreationTimestamp="2026-02-25 10:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:37.788792019 +0000 UTC m=+283.287374054" watchObservedRunningTime="2026-02-25 10:57:37.791355523 +0000 UTC 
m=+283.289937548" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.809099 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.810061 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjxjp" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.810905 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:57:37 crc kubenswrapper[4725]: I0225 10:57:37.818112 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2tdp" Feb 25 10:57:38 crc kubenswrapper[4725]: I0225 10:57:38.765371 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:38 crc kubenswrapper[4725]: I0225 10:57:38.770064 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.201920 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6c8m5" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.201985 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6c8m5" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.251312 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6c8m5" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.457339 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bq27c"] Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.603489 
4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gx7mt" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.603565 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gx7mt" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.661385 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dcstn"] Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.679572 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gx7mt" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.775448 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dcstn" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="registry-server" containerID="cri-o://0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5" gracePeriod=2 Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.827527 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gx7mt" Feb 25 10:57:39 crc kubenswrapper[4725]: I0225 10:57:39.827595 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6c8m5" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.108653 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.274942 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxw9b\" (UniqueName: \"kubernetes.io/projected/47446d07-b5cf-4646-b54b-0e841fb3a662-kube-api-access-lxw9b\") pod \"47446d07-b5cf-4646-b54b-0e841fb3a662\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.275100 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-utilities\") pod \"47446d07-b5cf-4646-b54b-0e841fb3a662\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.275171 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-catalog-content\") pod \"47446d07-b5cf-4646-b54b-0e841fb3a662\" (UID: \"47446d07-b5cf-4646-b54b-0e841fb3a662\") " Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.276898 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-utilities" (OuterVolumeSpecName: "utilities") pod "47446d07-b5cf-4646-b54b-0e841fb3a662" (UID: "47446d07-b5cf-4646-b54b-0e841fb3a662"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.283034 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47446d07-b5cf-4646-b54b-0e841fb3a662-kube-api-access-lxw9b" (OuterVolumeSpecName: "kube-api-access-lxw9b") pod "47446d07-b5cf-4646-b54b-0e841fb3a662" (UID: "47446d07-b5cf-4646-b54b-0e841fb3a662"). InnerVolumeSpecName "kube-api-access-lxw9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.333434 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47446d07-b5cf-4646-b54b-0e841fb3a662" (UID: "47446d07-b5cf-4646-b54b-0e841fb3a662"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.377407 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.377443 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47446d07-b5cf-4646-b54b-0e841fb3a662-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.377460 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxw9b\" (UniqueName: \"kubernetes.io/projected/47446d07-b5cf-4646-b54b-0e841fb3a662-kube-api-access-lxw9b\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.711041 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.711637 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.770272 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.780006 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerID="0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5" exitCode=0 Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.780103 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dcstn" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.780175 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dcstn" event={"ID":"47446d07-b5cf-4646-b54b-0e841fb3a662","Type":"ContainerDied","Data":"0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5"} Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.780236 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dcstn" event={"ID":"47446d07-b5cf-4646-b54b-0e841fb3a662","Type":"ContainerDied","Data":"2b697a7684e1e9acaba2496de7c01c17d5bf55cf5c5dff2c730a875468ffa4e0"} Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.780257 4725 scope.go:117] "RemoveContainer" containerID="0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.780589 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bq27c" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="registry-server" containerID="cri-o://92d7a0707f23c25b59c2f86679f48462f3ec1c333b56b23830e7b275ede09595" gracePeriod=2 Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.809744 4725 scope.go:117] "RemoveContainer" containerID="e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.817693 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dcstn"] Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.822569 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-dcstn"] Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.833269 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n87p9" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.838812 4725 scope.go:117] "RemoveContainer" containerID="a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.856322 4725 scope.go:117] "RemoveContainer" containerID="0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5" Feb 25 10:57:40 crc kubenswrapper[4725]: E0225 10:57:40.857268 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5\": container with ID starting with 0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5 not found: ID does not exist" containerID="0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.857317 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5"} err="failed to get container status \"0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5\": rpc error: code = NotFound desc = could not find container \"0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5\": container with ID starting with 0d22cc2b612036d91cb456f8c5b22429ab900fe6f46a2c5d3b5308224d7fbab5 not found: ID does not exist" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.857349 4725 scope.go:117] "RemoveContainer" containerID="e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242" Feb 25 10:57:40 crc kubenswrapper[4725]: E0225 10:57:40.857733 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242\": container with ID starting with e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242 not found: ID does not exist" containerID="e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.857771 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242"} err="failed to get container status \"e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242\": rpc error: code = NotFound desc = could not find container \"e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242\": container with ID starting with e1485aac1b7a0e469f51b2d0a56ca3e1e7a6fba05acdf45316b2dee1bc9bb242 not found: ID does not exist" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.857792 4725 scope.go:117] "RemoveContainer" containerID="a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a" Feb 25 10:57:40 crc kubenswrapper[4725]: E0225 10:57:40.858242 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a\": container with ID starting with a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a not found: ID does not exist" containerID="a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a" Feb 25 10:57:40 crc kubenswrapper[4725]: I0225 10:57:40.858278 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a"} err="failed to get container status \"a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a\": rpc error: code = NotFound desc = could not find container 
\"a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a\": container with ID starting with a5f02942754648c239aca16b4e7748db356bb82e0277a825549ef6e0da76fc4a not found: ID does not exist" Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.233097 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" path="/var/lib/kubelet/pods/47446d07-b5cf-4646-b54b-0e841fb3a662/volumes" Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.556182 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.556269 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.556348 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.557196 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.557293 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2" gracePeriod=600 Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.790062 4725 generic.go:334] "Generic (PLEG): container finished" podID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerID="92d7a0707f23c25b59c2f86679f48462f3ec1c333b56b23830e7b275ede09595" exitCode=0 Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.790264 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bq27c" event={"ID":"dec8f4b6-001e-4ce7-b6d4-55b197612a38","Type":"ContainerDied","Data":"92d7a0707f23c25b59c2f86679f48462f3ec1c333b56b23830e7b275ede09595"} Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.794441 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2" exitCode=0 Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.794976 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2"} Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.834113 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bq27c" Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.862532 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7mt"] Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.862854 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gx7mt" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="registry-server" containerID="cri-o://7c4eb0b109b1a105a572b1bd97ac4f58436abb803aaddef36e3d17c134f4ff7c" gracePeriod=2 Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.996689 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-catalog-content\") pod \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.997843 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-utilities\") pod \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.998261 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgnwr\" (UniqueName: \"kubernetes.io/projected/dec8f4b6-001e-4ce7-b6d4-55b197612a38-kube-api-access-vgnwr\") pod \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\" (UID: \"dec8f4b6-001e-4ce7-b6d4-55b197612a38\") " Feb 25 10:57:41 crc kubenswrapper[4725]: I0225 10:57:41.998985 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-utilities" (OuterVolumeSpecName: "utilities") pod "dec8f4b6-001e-4ce7-b6d4-55b197612a38" (UID: 
"dec8f4b6-001e-4ce7-b6d4-55b197612a38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.002559 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec8f4b6-001e-4ce7-b6d4-55b197612a38-kube-api-access-vgnwr" (OuterVolumeSpecName: "kube-api-access-vgnwr") pod "dec8f4b6-001e-4ce7-b6d4-55b197612a38" (UID: "dec8f4b6-001e-4ce7-b6d4-55b197612a38"). InnerVolumeSpecName "kube-api-access-vgnwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.053060 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dec8f4b6-001e-4ce7-b6d4-55b197612a38" (UID: "dec8f4b6-001e-4ce7-b6d4-55b197612a38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.100655 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.100808 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec8f4b6-001e-4ce7-b6d4-55b197612a38-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.101099 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgnwr\" (UniqueName: \"kubernetes.io/projected/dec8f4b6-001e-4ce7-b6d4-55b197612a38-kube-api-access-vgnwr\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.804261 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bq27c" event={"ID":"dec8f4b6-001e-4ce7-b6d4-55b197612a38","Type":"ContainerDied","Data":"1519413a92799634ca8955c05617980b1b05090db7d4a10e7ee7a9642b3f5b7f"}
Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.804654 4725 scope.go:117] "RemoveContainer" containerID="92d7a0707f23c25b59c2f86679f48462f3ec1c333b56b23830e7b275ede09595"
Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.804373 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bq27c"
Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.839860 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bq27c"]
Feb 25 10:57:42 crc kubenswrapper[4725]: I0225 10:57:42.843686 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bq27c"]
Feb 25 10:57:43 crc kubenswrapper[4725]: I0225 10:57:43.047264 4725 scope.go:117] "RemoveContainer" containerID="ba9566c85ccf85b3c88cb6213bad7ca072baa47e90d6bec4d0d19f114205da28"
Feb 25 10:57:43 crc kubenswrapper[4725]: I0225 10:57:43.077117 4725 scope.go:117] "RemoveContainer" containerID="42fe0e5d5dc915f4da6c822faad13fa6c4c2db0c74f8f862a8cc877c89ed20c6"
Feb 25 10:57:43 crc kubenswrapper[4725]: I0225 10:57:43.233683 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" path="/var/lib/kubelet/pods/dec8f4b6-001e-4ce7-b6d4-55b197612a38/volumes"
Feb 25 10:57:43 crc kubenswrapper[4725]: I0225 10:57:43.813084 4725 generic.go:334] "Generic (PLEG): container finished" podID="8817d816-5958-4498-8a0d-528952c47e3a" containerID="7c4eb0b109b1a105a572b1bd97ac4f58436abb803aaddef36e3d17c134f4ff7c" exitCode=0
Feb 25 10:57:43 crc kubenswrapper[4725]: I0225 10:57:43.813149 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7mt" event={"ID":"8817d816-5958-4498-8a0d-528952c47e3a","Type":"ContainerDied","Data":"7c4eb0b109b1a105a572b1bd97ac4f58436abb803aaddef36e3d17c134f4ff7c"}
Feb 25 10:57:43 crc kubenswrapper[4725]: I0225 10:57:43.815536 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"157d59ad935f84209be35b0af551db160cdd330b6d67a7efbf73190929b1d6c7"}
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.409240 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.429198 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-utilities\") pod \"8817d816-5958-4498-8a0d-528952c47e3a\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") "
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.429290 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-catalog-content\") pod \"8817d816-5958-4498-8a0d-528952c47e3a\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") "
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.429319 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b99hf\" (UniqueName: \"kubernetes.io/projected/8817d816-5958-4498-8a0d-528952c47e3a-kube-api-access-b99hf\") pod \"8817d816-5958-4498-8a0d-528952c47e3a\" (UID: \"8817d816-5958-4498-8a0d-528952c47e3a\") "
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.430714 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-utilities" (OuterVolumeSpecName: "utilities") pod "8817d816-5958-4498-8a0d-528952c47e3a" (UID: "8817d816-5958-4498-8a0d-528952c47e3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.435099 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8817d816-5958-4498-8a0d-528952c47e3a-kube-api-access-b99hf" (OuterVolumeSpecName: "kube-api-access-b99hf") pod "8817d816-5958-4498-8a0d-528952c47e3a" (UID: "8817d816-5958-4498-8a0d-528952c47e3a"). InnerVolumeSpecName "kube-api-access-b99hf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.454599 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n87p9"]
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.454918 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n87p9" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="registry-server" containerID="cri-o://e0df7d9b0e7655104cce99e2802ab775c0b774e714c76d212a7984bbe56d267e" gracePeriod=2
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.471195 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8817d816-5958-4498-8a0d-528952c47e3a" (UID: "8817d816-5958-4498-8a0d-528952c47e3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.530744 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.530783 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8817d816-5958-4498-8a0d-528952c47e3a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.530797 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b99hf\" (UniqueName: \"kubernetes.io/projected/8817d816-5958-4498-8a0d-528952c47e3a-kube-api-access-b99hf\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.828816 4725 generic.go:334] "Generic (PLEG): container finished" podID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerID="e0df7d9b0e7655104cce99e2802ab775c0b774e714c76d212a7984bbe56d267e" exitCode=0
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.828901 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n87p9" event={"ID":"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5","Type":"ContainerDied","Data":"e0df7d9b0e7655104cce99e2802ab775c0b774e714c76d212a7984bbe56d267e"}
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.832116 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gx7mt" event={"ID":"8817d816-5958-4498-8a0d-528952c47e3a","Type":"ContainerDied","Data":"0bf8fb6d6399bf0bd81e2edd0e4c16c2d15baa064bd7336768739c3708859e01"}
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.832152 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gx7mt"
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.832175 4725 scope.go:117] "RemoveContainer" containerID="7c4eb0b109b1a105a572b1bd97ac4f58436abb803aaddef36e3d17c134f4ff7c"
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.856699 4725 scope.go:117] "RemoveContainer" containerID="5e49d4fb2df3a0793878af6b7d965fb5ea571752b9e83e47de9d3418b700c559"
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.887905 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7mt"]
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.892304 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gx7mt"]
Feb 25 10:57:44 crc kubenswrapper[4725]: I0225 10:57:44.893728 4725 scope.go:117] "RemoveContainer" containerID="c7f8d31c04d7b33f5fc901be77864ef89d7f7eb17274e82626f90ad69d6fce5e"
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.233757 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8817d816-5958-4498-8a0d-528952c47e3a" path="/var/lib/kubelet/pods/8817d816-5958-4498-8a0d-528952c47e3a/volumes"
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.373865 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n87p9"
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.547851 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcj2n\" (UniqueName: \"kubernetes.io/projected/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-kube-api-access-qcj2n\") pod \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") "
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.548010 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-utilities\") pod \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") "
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.548090 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-catalog-content\") pod \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\" (UID: \"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5\") "
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.550129 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-utilities" (OuterVolumeSpecName: "utilities") pod "c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" (UID: "c934ca68-7c23-4a8f-8e09-8d3edad1e1a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.555446 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-kube-api-access-qcj2n" (OuterVolumeSpecName: "kube-api-access-qcj2n") pod "c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" (UID: "c934ca68-7c23-4a8f-8e09-8d3edad1e1a5"). InnerVolumeSpecName "kube-api-access-qcj2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.651180 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.651241 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcj2n\" (UniqueName: \"kubernetes.io/projected/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-kube-api-access-qcj2n\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.700521 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" (UID: "c934ca68-7c23-4a8f-8e09-8d3edad1e1a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.752373 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.841680 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n87p9" event={"ID":"c934ca68-7c23-4a8f-8e09-8d3edad1e1a5","Type":"ContainerDied","Data":"a640e3a88556dc6a44446a0281d8f17961ccf83cc5b57cd13c94fbd72e70f8a9"}
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.841728 4725 scope.go:117] "RemoveContainer" containerID="e0df7d9b0e7655104cce99e2802ab775c0b774e714c76d212a7984bbe56d267e"
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.841862 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n87p9"
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.869954 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n87p9"]
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.872217 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n87p9"]
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.879099 4725 scope.go:117] "RemoveContainer" containerID="af25536dc547144a10c7873e8381c55f1457684a63749a97a3083bc4226383e6"
Feb 25 10:57:45 crc kubenswrapper[4725]: I0225 10:57:45.894795 4725 scope.go:117] "RemoveContainer" containerID="e42f7faa476af5b1ba7fa3576b326f62838eefccb9017c95f10783b7498fbe06"
Feb 25 10:57:47 crc kubenswrapper[4725]: I0225 10:57:47.230152 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" path="/var/lib/kubelet/pods/c934ca68-7c23-4a8f-8e09-8d3edad1e1a5/volumes"
Feb 25 10:57:53 crc kubenswrapper[4725]: I0225 10:57:53.979793 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-674577d55b-kf647"]
Feb 25 10:57:53 crc kubenswrapper[4725]: I0225 10:57:53.980556 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" podUID="0efa5876-c9a6-4573-abc5-2dfb33360523" containerName="controller-manager" containerID="cri-o://e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05" gracePeriod=30
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.088629 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"]
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.088888 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" podUID="55167746-1927-4807-a57c-1ea8da26a883" containerName="route-controller-manager" containerID="cri-o://8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2" gracePeriod=30
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.667924 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.697007 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-config\") pod \"55167746-1927-4807-a57c-1ea8da26a883\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.697096 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55167746-1927-4807-a57c-1ea8da26a883-serving-cert\") pod \"55167746-1927-4807-a57c-1ea8da26a883\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.697134 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t95pl\" (UniqueName: \"kubernetes.io/projected/55167746-1927-4807-a57c-1ea8da26a883-kube-api-access-t95pl\") pod \"55167746-1927-4807-a57c-1ea8da26a883\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.697219 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-client-ca\") pod \"55167746-1927-4807-a57c-1ea8da26a883\" (UID: \"55167746-1927-4807-a57c-1ea8da26a883\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.697846 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-client-ca" (OuterVolumeSpecName: "client-ca") pod "55167746-1927-4807-a57c-1ea8da26a883" (UID: "55167746-1927-4807-a57c-1ea8da26a883"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.698281 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-config" (OuterVolumeSpecName: "config") pod "55167746-1927-4807-a57c-1ea8da26a883" (UID: "55167746-1927-4807-a57c-1ea8da26a883"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.702922 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55167746-1927-4807-a57c-1ea8da26a883-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "55167746-1927-4807-a57c-1ea8da26a883" (UID: "55167746-1927-4807-a57c-1ea8da26a883"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.706995 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55167746-1927-4807-a57c-1ea8da26a883-kube-api-access-t95pl" (OuterVolumeSpecName: "kube-api-access-t95pl") pod "55167746-1927-4807-a57c-1ea8da26a883" (UID: "55167746-1927-4807-a57c-1ea8da26a883"). InnerVolumeSpecName "kube-api-access-t95pl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.725116 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-674577d55b-kf647"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.798594 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efa5876-c9a6-4573-abc5-2dfb33360523-serving-cert\") pod \"0efa5876-c9a6-4573-abc5-2dfb33360523\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.798652 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-client-ca\") pod \"0efa5876-c9a6-4573-abc5-2dfb33360523\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.798717 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-proxy-ca-bundles\") pod \"0efa5876-c9a6-4573-abc5-2dfb33360523\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.798791 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-config\") pod \"0efa5876-c9a6-4573-abc5-2dfb33360523\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.798845 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9brc\" (UniqueName: \"kubernetes.io/projected/0efa5876-c9a6-4573-abc5-2dfb33360523-kube-api-access-b9brc\") pod \"0efa5876-c9a6-4573-abc5-2dfb33360523\" (UID: \"0efa5876-c9a6-4573-abc5-2dfb33360523\") "
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.799167 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-client-ca\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.799190 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55167746-1927-4807-a57c-1ea8da26a883-config\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.799202 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55167746-1927-4807-a57c-1ea8da26a883-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.799214 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t95pl\" (UniqueName: \"kubernetes.io/projected/55167746-1927-4807-a57c-1ea8da26a883-kube-api-access-t95pl\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.799618 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0efa5876-c9a6-4573-abc5-2dfb33360523" (UID: "0efa5876-c9a6-4573-abc5-2dfb33360523"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.799731 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-client-ca" (OuterVolumeSpecName: "client-ca") pod "0efa5876-c9a6-4573-abc5-2dfb33360523" (UID: "0efa5876-c9a6-4573-abc5-2dfb33360523"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.799812 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-config" (OuterVolumeSpecName: "config") pod "0efa5876-c9a6-4573-abc5-2dfb33360523" (UID: "0efa5876-c9a6-4573-abc5-2dfb33360523"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.802027 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efa5876-c9a6-4573-abc5-2dfb33360523-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0efa5876-c9a6-4573-abc5-2dfb33360523" (UID: "0efa5876-c9a6-4573-abc5-2dfb33360523"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.802758 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0efa5876-c9a6-4573-abc5-2dfb33360523-kube-api-access-b9brc" (OuterVolumeSpecName: "kube-api-access-b9brc") pod "0efa5876-c9a6-4573-abc5-2dfb33360523" (UID: "0efa5876-c9a6-4573-abc5-2dfb33360523"). InnerVolumeSpecName "kube-api-access-b9brc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.900002 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efa5876-c9a6-4573-abc5-2dfb33360523-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.900049 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-client-ca\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.900061 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.900085 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0efa5876-c9a6-4573-abc5-2dfb33360523-config\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.900099 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9brc\" (UniqueName: \"kubernetes.io/projected/0efa5876-c9a6-4573-abc5-2dfb33360523-kube-api-access-b9brc\") on node \"crc\" DevicePath \"\""
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.904880 4725 generic.go:334] "Generic (PLEG): container finished" podID="55167746-1927-4807-a57c-1ea8da26a883" containerID="8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2" exitCode=0
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.904921 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.904946 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" event={"ID":"55167746-1927-4807-a57c-1ea8da26a883","Type":"ContainerDied","Data":"8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2"}
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.904972 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf" event={"ID":"55167746-1927-4807-a57c-1ea8da26a883","Type":"ContainerDied","Data":"3bb08942e43406bef47643287ca5db58299bb0900c86b5234239dc1e882140b6"}
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.904987 4725 scope.go:117] "RemoveContainer" containerID="8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.906207 4725 generic.go:334] "Generic (PLEG): container finished" podID="0efa5876-c9a6-4573-abc5-2dfb33360523" containerID="e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05" exitCode=0
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.906230 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" event={"ID":"0efa5876-c9a6-4573-abc5-2dfb33360523","Type":"ContainerDied","Data":"e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05"}
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.906246 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674577d55b-kf647" event={"ID":"0efa5876-c9a6-4573-abc5-2dfb33360523","Type":"ContainerDied","Data":"6c663ccc238a7503092014ab7d00455abd574aec50a5eca780424607dfb22fb8"}
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.906312 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-674577d55b-kf647"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.919948 4725 scope.go:117] "RemoveContainer" containerID="8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2"
Feb 25 10:57:54 crc kubenswrapper[4725]: E0225 10:57:54.920335 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2\": container with ID starting with 8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2 not found: ID does not exist" containerID="8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.920386 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2"} err="failed to get container status \"8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2\": rpc error: code = NotFound desc = could not find container \"8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2\": container with ID starting with 8bce12e6be15f25a9453b04ea66465cbee55451d13b64918128514563532f1e2 not found: ID does not exist"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.920418 4725 scope.go:117] "RemoveContainer" containerID="e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.928634 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" podUID="24bebe29-933d-4461-8aab-b7d17e815781" containerName="oauth-openshift" containerID="cri-o://c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3" gracePeriod=15
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.933477 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"]
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.936396 4725 scope.go:117] "RemoveContainer" containerID="e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.939591 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78c65587-k6gqf"]
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.945024 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-674577d55b-kf647"]
Feb 25 10:57:54 crc kubenswrapper[4725]: E0225 10:57:54.936965 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05\": container with ID starting with e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05 not found: ID does not exist" containerID="e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.945153 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05"} err="failed to get container status \"e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05\": rpc error: code = NotFound desc = could not find container \"e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05\": container with ID starting with e13e75ac5379db7edafacc83e260a43c67ef540a9a6df2222ceaa6e35294db05 not found: ID does not exist"
Feb 25 10:57:54 crc kubenswrapper[4725]: I0225 10:57:54.947512 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-674577d55b-kf647"]
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.231313 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0efa5876-c9a6-4573-abc5-2dfb33360523" path="/var/lib/kubelet/pods/0efa5876-c9a6-4573-abc5-2dfb33360523/volumes"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.231792 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55167746-1927-4807-a57c-1ea8da26a883" path="/var/lib/kubelet/pods/55167746-1927-4807-a57c-1ea8da26a883/volumes"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517399 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46"]
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517625 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517637 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517651 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517658 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517665 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517671 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517681 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517687 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517694 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517699 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517707 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517712 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517719 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517725 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517732 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517739 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517745 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517751 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517760 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efa5876-c9a6-4573-abc5-2dfb33360523" containerName="controller-manager"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517767 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efa5876-c9a6-4573-abc5-2dfb33360523" containerName="controller-manager"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517777 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55167746-1927-4807-a57c-1ea8da26a883" containerName="route-controller-manager"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517784 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="55167746-1927-4807-a57c-1ea8da26a883" containerName="route-controller-manager"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517793 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517799 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="extract-utilities"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517806 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517811 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="extract-content"
Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.517821 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517841 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517945 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8817d816-5958-4498-8a0d-528952c47e3a" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517956 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="47446d07-b5cf-4646-b54b-0e841fb3a662" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517963 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec8f4b6-001e-4ce7-b6d4-55b197612a38" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517973 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="55167746-1927-4807-a57c-1ea8da26a883" containerName="route-controller-manager"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517989 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0efa5876-c9a6-4573-abc5-2dfb33360523" containerName="controller-manager"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.517997 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c934ca68-7c23-4a8f-8e09-8d3edad1e1a5" containerName="registry-server"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.518370 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.520838 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.520906 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.520927 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.521218 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.521686 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.522106 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d"]
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.523053 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.523500 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.530527 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.530532 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.530762 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.530816 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.531274 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.532478 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.533615 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.535801 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46"] Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.543944 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d"] 
Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.608919 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-client-ca\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.608990 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba3caad-8456-4418-9f30-b6e3557c5bd9-config\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.609016 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fba3caad-8456-4418-9f30-b6e3557c5bd9-client-ca\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.609194 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba3caad-8456-4418-9f30-b6e3557c5bd9-serving-cert\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.609265 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxwx\" (UniqueName: 
\"kubernetes.io/projected/fba3caad-8456-4418-9f30-b6e3557c5bd9-kube-api-access-vnxwx\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.609320 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-config\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.609351 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-proxy-ca-bundles\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.609416 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994975de-aa99-41f5-aa6e-aed1664babd4-serving-cert\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.609463 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcpwj\" (UniqueName: \"kubernetes.io/projected/994975de-aa99-41f5-aa6e-aed1664babd4-kube-api-access-vcpwj\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " 
pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716020 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcpwj\" (UniqueName: \"kubernetes.io/projected/994975de-aa99-41f5-aa6e-aed1664babd4-kube-api-access-vcpwj\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716078 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-client-ca\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716124 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba3caad-8456-4418-9f30-b6e3557c5bd9-config\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716146 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fba3caad-8456-4418-9f30-b6e3557c5bd9-client-ca\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716212 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fba3caad-8456-4418-9f30-b6e3557c5bd9-serving-cert\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716242 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnxwx\" (UniqueName: \"kubernetes.io/projected/fba3caad-8456-4418-9f30-b6e3557c5bd9-kube-api-access-vnxwx\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716279 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-config\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716306 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-proxy-ca-bundles\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.716344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994975de-aa99-41f5-aa6e-aed1664babd4-serving-cert\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: 
I0225 10:57:55.718063 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fba3caad-8456-4418-9f30-b6e3557c5bd9-client-ca\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.718991 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-client-ca\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.720104 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-config\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.721079 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/994975de-aa99-41f5-aa6e-aed1664babd4-proxy-ca-bundles\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.726634 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba3caad-8456-4418-9f30-b6e3557c5bd9-serving-cert\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " 
pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.727183 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994975de-aa99-41f5-aa6e-aed1664babd4-serving-cert\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.728246 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba3caad-8456-4418-9f30-b6e3557c5bd9-config\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.734572 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcpwj\" (UniqueName: \"kubernetes.io/projected/994975de-aa99-41f5-aa6e-aed1664babd4-kube-api-access-vcpwj\") pod \"controller-manager-6cdbfbbb9f-fxq46\" (UID: \"994975de-aa99-41f5-aa6e-aed1664babd4\") " pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.735245 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnxwx\" (UniqueName: \"kubernetes.io/projected/fba3caad-8456-4418-9f30-b6e3557c5bd9-kube-api-access-vnxwx\") pod \"route-controller-manager-945f67457-4sx4d\" (UID: \"fba3caad-8456-4418-9f30-b6e3557c5bd9\") " pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.832341 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.853656 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.891128 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.919254 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-error\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920108 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-provider-selection\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920140 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2gvm\" (UniqueName: \"kubernetes.io/projected/24bebe29-933d-4461-8aab-b7d17e815781-kube-api-access-r2gvm\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920623 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-ocp-branding-template\") 
pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920648 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-login\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920720 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-cliconfig\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920746 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-service-ca\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920822 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-trusted-ca-bundle\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920872 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24bebe29-933d-4461-8aab-b7d17e815781-audit-dir\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 
10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920900 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-idp-0-file-data\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920928 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-audit-policies\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.920950 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-serving-cert\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.921323 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-session\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.921347 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-router-certs\") pod \"24bebe29-933d-4461-8aab-b7d17e815781\" (UID: \"24bebe29-933d-4461-8aab-b7d17e815781\") " Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.925923 4725 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.926181 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.926654 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.926893 4725 generic.go:334] "Generic (PLEG): container finished" podID="24bebe29-933d-4461-8aab-b7d17e815781" containerID="c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3" exitCode=0 Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.926959 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" event={"ID":"24bebe29-933d-4461-8aab-b7d17e815781","Type":"ContainerDied","Data":"c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3"} Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.926988 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" event={"ID":"24bebe29-933d-4461-8aab-b7d17e815781","Type":"ContainerDied","Data":"d30fffcbdbaaf02fe2d40032c7e8b59a7420fd4a0ff9c0ecd61b48c426f360da"} Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.927008 4725 scope.go:117] "RemoveContainer" containerID="c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.927115 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6trwd" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.927611 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.930255 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24bebe29-933d-4461-8aab-b7d17e815781-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.930688 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.952470 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bebe29-933d-4461-8aab-b7d17e815781-kube-api-access-r2gvm" (OuterVolumeSpecName: "kube-api-access-r2gvm") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "kube-api-access-r2gvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.952351 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.953699 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.955334 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.957695 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.960923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.968377 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.968611 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "24bebe29-933d-4461-8aab-b7d17e815781" (UID: "24bebe29-933d-4461-8aab-b7d17e815781"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.984246 4725 scope.go:117] "RemoveContainer" containerID="c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3" Feb 25 10:57:55 crc kubenswrapper[4725]: E0225 10:57:55.994277 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3\": container with ID starting with c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3 not found: ID does not exist" containerID="c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3" Feb 25 10:57:55 crc kubenswrapper[4725]: I0225 10:57:55.994324 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3"} err="failed to get container status \"c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3\": rpc error: code = NotFound desc = could not find container \"c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3\": container with ID starting with c4f8775cc7b1ab71d9198500535d7cfda978a7a60830132248e4aa80360a32b3 not found: ID does not exist" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024046 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024086 4725 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24bebe29-933d-4461-8aab-b7d17e815781-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024102 4725 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024118 4725 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024131 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024143 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024157 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024168 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024180 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: 
I0225 10:57:56.024193 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2gvm\" (UniqueName: \"kubernetes.io/projected/24bebe29-933d-4461-8aab-b7d17e815781-kube-api-access-r2gvm\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024205 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024217 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024228 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.024239 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24bebe29-933d-4461-8aab-b7d17e815781-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.125302 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46"] Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.223325 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d"] Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.269252 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-6trwd"] Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.273007 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6trwd"] Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.937130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" event={"ID":"994975de-aa99-41f5-aa6e-aed1664babd4","Type":"ContainerStarted","Data":"e4db2d8d19cd8e80ad6b3058b8030900e28fb0099635e95f990a1309b2f93109"} Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.937444 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" event={"ID":"994975de-aa99-41f5-aa6e-aed1664babd4","Type":"ContainerStarted","Data":"b8c9a8b1a9057dbdd1798eba7376fbf4a849d7546a228f7ec5edfcd402794bc4"} Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.938770 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.940907 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" event={"ID":"fba3caad-8456-4418-9f30-b6e3557c5bd9","Type":"ContainerStarted","Data":"a28de7c284972b0b8bbd42d33cdeb92567c1e4d9673bb85f84e4a924c5555cf4"} Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.940948 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" event={"ID":"fba3caad-8456-4418-9f30-b6e3557c5bd9","Type":"ContainerStarted","Data":"5c1ea2410b4041cbbed2f9e0e9386ea4958b4912048c3842828128ca051d8196"} Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.946770 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" Feb 25 10:57:56 crc kubenswrapper[4725]: I0225 10:57:56.958247 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cdbfbbb9f-fxq46" podStartSLOduration=3.958215038 podStartE2EDuration="3.958215038s" podCreationTimestamp="2026-02-25 10:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:56.955331934 +0000 UTC m=+302.453913959" watchObservedRunningTime="2026-02-25 10:57:56.958215038 +0000 UTC m=+302.456797073" Feb 25 10:57:57 crc kubenswrapper[4725]: I0225 10:57:57.024158 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" podStartSLOduration=3.024131909 podStartE2EDuration="3.024131909s" podCreationTimestamp="2026-02-25 10:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:57:56.990298328 +0000 UTC m=+302.488880373" watchObservedRunningTime="2026-02-25 10:57:57.024131909 +0000 UTC m=+302.522713934" Feb 25 10:57:57 crc kubenswrapper[4725]: I0225 10:57:57.231330 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bebe29-933d-4461-8aab-b7d17e815781" path="/var/lib/kubelet/pods/24bebe29-933d-4461-8aab-b7d17e815781/volumes" Feb 25 10:57:57 crc kubenswrapper[4725]: I0225 10:57:57.947495 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:57 crc kubenswrapper[4725]: I0225 10:57:57.952879 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-945f67457-4sx4d" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 
10:57:59.528027 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-968f9d98-mr8ng"] Feb 25 10:57:59 crc kubenswrapper[4725]: E0225 10:57:59.528244 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bebe29-933d-4461-8aab-b7d17e815781" containerName="oauth-openshift" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.528255 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bebe29-933d-4461-8aab-b7d17e815781" containerName="oauth-openshift" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.528335 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bebe29-933d-4461-8aab-b7d17e815781" containerName="oauth-openshift" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.528693 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.534004 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.534602 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.534781 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.535121 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.535289 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.535496 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.536410 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.536593 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.537002 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.537300 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.537517 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.538238 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.540723 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-968f9d98-mr8ng"] Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.554121 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.554905 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.562408 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 10:57:59 crc 
kubenswrapper[4725]: I0225 10:57:59.572085 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-router-certs\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572139 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572200 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-service-ca\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572223 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-serving-cert\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572242 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7hhfh\" (UniqueName: \"kubernetes.io/projected/513f082d-4861-4d76-9835-92ea65739311-kube-api-access-7hhfh\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572262 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572286 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/513f082d-4861-4d76-9835-92ea65739311-audit-dir\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572304 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-audit-policies\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572318 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-session\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " 
pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572343 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-login\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572366 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-cliconfig\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572468 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-error\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572524 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.572589 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673352 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/513f082d-4861-4d76-9835-92ea65739311-audit-dir\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673407 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-audit-policies\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673429 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-session\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673467 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-login\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " 
pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673487 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-cliconfig\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673524 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-error\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673523 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/513f082d-4861-4d76-9835-92ea65739311-audit-dir\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673545 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673608 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673679 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-router-certs\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673716 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673804 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-service-ca\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673888 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-serving-cert\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " 
pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673929 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhfh\" (UniqueName: \"kubernetes.io/projected/513f082d-4861-4d76-9835-92ea65739311-kube-api-access-7hhfh\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.673961 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.674806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.675138 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-audit-policies\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.675395 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-cliconfig\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.675417 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-service-ca\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.680487 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-serving-cert\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.680987 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.682768 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-router-certs\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 
25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.682918 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-session\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.683465 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-error\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.684317 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-template-login\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.684474 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.687218 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/513f082d-4861-4d76-9835-92ea65739311-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.701346 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhfh\" (UniqueName: \"kubernetes.io/projected/513f082d-4861-4d76-9835-92ea65739311-kube-api-access-7hhfh\") pod \"oauth-openshift-968f9d98-mr8ng\" (UID: \"513f082d-4861-4d76-9835-92ea65739311\") " pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:57:59 crc kubenswrapper[4725]: I0225 10:57:59.862062 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.135429 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533618-tbgdb"] Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.136476 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.140338 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.140419 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.142077 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.155256 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533618-tbgdb"] Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.181424 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6dd\" (UniqueName: \"kubernetes.io/projected/1c413df5-7174-492a-8ab4-314e9be6bf83-kube-api-access-ns6dd\") pod \"auto-csr-approver-29533618-tbgdb\" (UID: \"1c413df5-7174-492a-8ab4-314e9be6bf83\") " pod="openshift-infra/auto-csr-approver-29533618-tbgdb" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.283492 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6dd\" (UniqueName: \"kubernetes.io/projected/1c413df5-7174-492a-8ab4-314e9be6bf83-kube-api-access-ns6dd\") pod \"auto-csr-approver-29533618-tbgdb\" (UID: \"1c413df5-7174-492a-8ab4-314e9be6bf83\") " pod="openshift-infra/auto-csr-approver-29533618-tbgdb" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.289650 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-968f9d98-mr8ng"] Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.310853 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6dd\" (UniqueName: 
\"kubernetes.io/projected/1c413df5-7174-492a-8ab4-314e9be6bf83-kube-api-access-ns6dd\") pod \"auto-csr-approver-29533618-tbgdb\" (UID: \"1c413df5-7174-492a-8ab4-314e9be6bf83\") " pod="openshift-infra/auto-csr-approver-29533618-tbgdb" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.452699 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.850165 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533618-tbgdb"] Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.963610 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" event={"ID":"1c413df5-7174-492a-8ab4-314e9be6bf83","Type":"ContainerStarted","Data":"f081935a4a4821ae23e1b6393af2c3c79199d9a56e72cebca0360601d77a7fdb"} Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.965331 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" event={"ID":"513f082d-4861-4d76-9835-92ea65739311","Type":"ContainerStarted","Data":"c04ee86752fca808a163becf9da446dadaae3194a12f1ce4b29b19fbca5e5672"} Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.965379 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" event={"ID":"513f082d-4861-4d76-9835-92ea65739311","Type":"ContainerStarted","Data":"5429db8f650c8967a25dcd7e16fdb7cf65d5129b1486d174f354c7d08dcb6894"} Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.965640 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:58:00 crc kubenswrapper[4725]: I0225 10:58:00.996621 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" 
podStartSLOduration=31.996603441 podStartE2EDuration="31.996603441s" podCreationTimestamp="2026-02-25 10:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:58:00.994493899 +0000 UTC m=+306.493076014" watchObservedRunningTime="2026-02-25 10:58:00.996603441 +0000 UTC m=+306.495185476" Feb 25 10:58:01 crc kubenswrapper[4725]: I0225 10:58:01.147299 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-968f9d98-mr8ng" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.097551 4725 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.099947 4725 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.100132 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.100187 4725 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.100684 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb" gracePeriod=15 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.100734 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32" gracePeriod=15 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.100729 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d" gracePeriod=15 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.100873 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f" gracePeriod=15 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.100888 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" containerID="cri-o://76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364" gracePeriod=15 Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.101179 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.101256 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.101313 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.101376 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.101506 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.101602 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.101658 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.101715 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.101782 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.102985 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.103167 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.103263 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.103345 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.103403 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.103475 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.103577 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.103648 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.103774 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.103969 4725 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104057 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104123 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104242 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104319 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104379 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104432 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104497 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.104663 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104729 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.104916 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.105519 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.138640 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213231 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213274 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213330 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213382 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213419 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213446 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213494 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.213515 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 
10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314240 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314311 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314333 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314373 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314389 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: 
I0225 10:58:02.314410 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314437 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314479 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314505 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314517 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314533 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314637 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314674 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314744 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314881 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.314928 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.435991 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:02 crc kubenswrapper[4725]: W0225 10:58:02.469073 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-98b2298e7d614e8eb259a7cc7c14938c0eb3592d64f8df16d7bd91f3ebba4d22 WatchSource:0}: Error finding container 98b2298e7d614e8eb259a7cc7c14938c0eb3592d64f8df16d7bd91f3ebba4d22: Status 404 returned error can't find the container with id 98b2298e7d614e8eb259a7cc7c14938c0eb3592d64f8df16d7bd91f3ebba4d22 Feb 25 10:58:02 crc kubenswrapper[4725]: E0225 10:58:02.474248 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18977825e0b071b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:58:02.472362416 +0000 UTC m=+307.970944441,LastTimestamp:2026-02-25 10:58:02.472362416 +0000 UTC 
m=+307.970944441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.980515 4725 generic.go:334] "Generic (PLEG): container finished" podID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" containerID="3a6ae950ef7ebf2cc7120ed1c9de1df1af2bf779fa5e5b6f4ad8fd216cf5a4ba" exitCode=0 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.980706 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76bf95fe-88cd-4f68-a0d4-a5059c8b666a","Type":"ContainerDied","Data":"3a6ae950ef7ebf2cc7120ed1c9de1df1af2bf779fa5e5b6f4ad8fd216cf5a4ba"} Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.982329 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.982770 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.983588 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1"} Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.983700 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"98b2298e7d614e8eb259a7cc7c14938c0eb3592d64f8df16d7bd91f3ebba4d22"} Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.984318 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.985256 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.986179 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.987864 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.989015 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364" exitCode=0 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.989050 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb" exitCode=0 
Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.989060 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d" exitCode=0 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.989070 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32" exitCode=2 Feb 25 10:58:02 crc kubenswrapper[4725]: I0225 10:58:02.989154 4725 scope.go:117] "RemoveContainer" containerID="437f366c18026febcbff0fc8173c784cd1d3cc41dedab2b1e75fbb84ad9bd6b2" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.005724 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.009542 4725 generic.go:334] "Generic (PLEG): container finished" podID="1c413df5-7174-492a-8ab4-314e9be6bf83" containerID="0891e915e9feff026f54347fefc15acd2e37f71aa7cc2b70deef1316d7f8786b" exitCode=0 Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.009597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" event={"ID":"1c413df5-7174-492a-8ab4-314e9be6bf83","Type":"ContainerDied","Data":"0891e915e9feff026f54347fefc15acd2e37f71aa7cc2b70deef1316d7f8786b"} Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.010672 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.011427 4725 status_manager.go:851] "Failed 
to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.012161 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.466510 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.467306 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.467489 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.467695 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.472221 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.472985 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.473605 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.474031 4725 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.474246 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.474473 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.550588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.551129 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kubelet-dir\") pod \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.551393 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-var-lock\") pod \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.550771 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.551174 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "76bf95fe-88cd-4f68-a0d4-a5059c8b666a" (UID: "76bf95fe-88cd-4f68-a0d4-a5059c8b666a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.551445 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-var-lock" (OuterVolumeSpecName: "var-lock") pod "76bf95fe-88cd-4f68-a0d4-a5059c8b666a" (UID: "76bf95fe-88cd-4f68-a0d4-a5059c8b666a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.552144 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kube-api-access\") pod \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\" (UID: \"76bf95fe-88cd-4f68-a0d4-a5059c8b666a\") " Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.552389 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.552612 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.552474 
4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.552708 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.553662 4725 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.554082 4725 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.554318 4725 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.554461 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.554579 4725 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.560625 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "76bf95fe-88cd-4f68-a0d4-a5059c8b666a" (UID: "76bf95fe-88cd-4f68-a0d4-a5059c8b666a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:58:04 crc kubenswrapper[4725]: I0225 10:58:04.655761 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bf95fe-88cd-4f68-a0d4-a5059c8b666a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.020717 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.022219 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f" exitCode=0 Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.022371 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.022391 4725 scope.go:117] "RemoveContainer" containerID="76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.025163 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76bf95fe-88cd-4f68-a0d4-a5059c8b666a","Type":"ContainerDied","Data":"e4233e5fafca5a0c1a8cc99295315eedde85cfc51842b5302ef244ea761fe0b8"} Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.025229 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4233e5fafca5a0c1a8cc99295315eedde85cfc51842b5302ef244ea761fe0b8" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.025258 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.037225 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.037648 4725 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.038057 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.038509 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.041494 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.041859 4725 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.042327 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.042666 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.048074 4725 scope.go:117] "RemoveContainer" containerID="b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.064440 4725 scope.go:117] "RemoveContainer" containerID="41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.134514 4725 scope.go:117] "RemoveContainer" containerID="466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.153822 4725 scope.go:117] "RemoveContainer" containerID="7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.171619 4725 scope.go:117] "RemoveContainer" containerID="3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.194252 4725 scope.go:117] "RemoveContainer" containerID="76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364" Feb 25 10:58:05 crc kubenswrapper[4725]: E0225 10:58:05.194604 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\": container with ID starting with 76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364 not found: ID does not exist" containerID="76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.194633 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364"} err="failed to get container status 
\"76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\": rpc error: code = NotFound desc = could not find container \"76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364\": container with ID starting with 76fb5e67d9fb75179bd78dfd01ee80347daaf564564169f43908c0f8788d2364 not found: ID does not exist" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.194655 4725 scope.go:117] "RemoveContainer" containerID="b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb" Feb 25 10:58:05 crc kubenswrapper[4725]: E0225 10:58:05.195087 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\": container with ID starting with b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb not found: ID does not exist" containerID="b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.195103 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb"} err="failed to get container status \"b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\": rpc error: code = NotFound desc = could not find container \"b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb\": container with ID starting with b4be50f4a2b088901dfe4a1a2dacf961ab27a25eeae29c4a0a3de681e1d008fb not found: ID does not exist" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.195117 4725 scope.go:117] "RemoveContainer" containerID="41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d" Feb 25 10:58:05 crc kubenswrapper[4725]: E0225 10:58:05.195384 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\": container with ID starting with 41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d not found: ID does not exist" containerID="41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.195428 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d"} err="failed to get container status \"41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\": rpc error: code = NotFound desc = could not find container \"41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d\": container with ID starting with 41dfb85b0e5ea028800ac5361753a923ef16b912508ca009fe67bfc501a8700d not found: ID does not exist" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.195457 4725 scope.go:117] "RemoveContainer" containerID="466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32" Feb 25 10:58:05 crc kubenswrapper[4725]: E0225 10:58:05.197317 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\": container with ID starting with 466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32 not found: ID does not exist" containerID="466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.197343 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32"} err="failed to get container status \"466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\": rpc error: code = NotFound desc = could not find container \"466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32\": container with ID 
starting with 466383239cb4d9ad852b8529e0d5b9ca2dc8f1a1a537f70003282a9b5bc94b32 not found: ID does not exist" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.197355 4725 scope.go:117] "RemoveContainer" containerID="7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f" Feb 25 10:58:05 crc kubenswrapper[4725]: E0225 10:58:05.197612 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\": container with ID starting with 7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f not found: ID does not exist" containerID="7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.197641 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f"} err="failed to get container status \"7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\": rpc error: code = NotFound desc = could not find container \"7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f\": container with ID starting with 7b9fd90a796bc36949d8216d4a9b3e0a7cdb30504933fb77475ed128a0fcf88f not found: ID does not exist" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.197655 4725 scope.go:117] "RemoveContainer" containerID="3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e" Feb 25 10:58:05 crc kubenswrapper[4725]: E0225 10:58:05.197883 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\": container with ID starting with 3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e not found: ID does not exist" containerID="3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e" Feb 25 
10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.197903 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e"} err="failed to get container status \"3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\": rpc error: code = NotFound desc = could not find container \"3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e\": container with ID starting with 3c0c78cc5643f0ea0463f67edb6941920574deb07d979ea8fda5fb51709a8b5e not found: ID does not exist" Feb 25 10:58:05 crc kubenswrapper[4725]: E0225 10:58:05.200168 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18977825e0b071b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 10:58:02.472362416 +0000 UTC m=+307.970944441,LastTimestamp:2026-02-25 10:58:02.472362416 +0000 UTC m=+307.970944441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.230988 4725 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.231229 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.231700 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.232152 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.235643 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.407374 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.407808 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.408077 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.408230 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.465607 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns6dd\" (UniqueName: \"kubernetes.io/projected/1c413df5-7174-492a-8ab4-314e9be6bf83-kube-api-access-ns6dd\") pod \"1c413df5-7174-492a-8ab4-314e9be6bf83\" (UID: \"1c413df5-7174-492a-8ab4-314e9be6bf83\") " Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.469748 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c413df5-7174-492a-8ab4-314e9be6bf83-kube-api-access-ns6dd" (OuterVolumeSpecName: "kube-api-access-ns6dd") pod "1c413df5-7174-492a-8ab4-314e9be6bf83" (UID: 
"1c413df5-7174-492a-8ab4-314e9be6bf83"). InnerVolumeSpecName "kube-api-access-ns6dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 10:58:05 crc kubenswrapper[4725]: I0225 10:58:05.567296 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns6dd\" (UniqueName: \"kubernetes.io/projected/1c413df5-7174-492a-8ab4-314e9be6bf83-kube-api-access-ns6dd\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:06 crc kubenswrapper[4725]: I0225 10:58:06.034697 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" event={"ID":"1c413df5-7174-492a-8ab4-314e9be6bf83","Type":"ContainerDied","Data":"f081935a4a4821ae23e1b6393af2c3c79199d9a56e72cebca0360601d77a7fdb"} Feb 25 10:58:06 crc kubenswrapper[4725]: I0225 10:58:06.034757 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f081935a4a4821ae23e1b6393af2c3c79199d9a56e72cebca0360601d77a7fdb" Feb 25 10:58:06 crc kubenswrapper[4725]: I0225 10:58:06.034734 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" Feb 25 10:58:06 crc kubenswrapper[4725]: I0225 10:58:06.056896 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:06 crc kubenswrapper[4725]: I0225 10:58:06.057565 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:06 crc kubenswrapper[4725]: I0225 10:58:06.058335 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:09 crc kubenswrapper[4725]: E0225 10:58:09.690553 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:09 crc kubenswrapper[4725]: E0225 10:58:09.691884 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:09 crc kubenswrapper[4725]: E0225 10:58:09.692291 4725 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:09 crc kubenswrapper[4725]: E0225 10:58:09.692469 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:09 crc kubenswrapper[4725]: E0225 10:58:09.692670 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Feb 25 10:58:09 crc kubenswrapper[4725]: I0225 10:58:09.692689 4725 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 25 10:58:09 crc kubenswrapper[4725]: E0225 10:58:09.692866 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms" Feb 25 10:58:09 crc kubenswrapper[4725]: E0225 10:58:09.893872 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms" Feb 25 10:58:10 crc kubenswrapper[4725]: E0225 10:58:10.295284 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Feb 25 10:58:11 crc kubenswrapper[4725]: 
E0225 10:58:11.096942 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s"
Feb 25 10:58:12 crc kubenswrapper[4725]: E0225 10:58:12.698973 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="3.2s"
Feb 25 10:58:13 crc kubenswrapper[4725]: I0225 10:58:13.223846 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:13 crc kubenswrapper[4725]: I0225 10:58:13.224597 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused"
Feb 25 10:58:13 crc kubenswrapper[4725]: I0225 10:58:13.225369 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused"
Feb 25 10:58:13 crc kubenswrapper[4725]: I0225 10:58:13.225914 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused"
Feb 25 10:58:13 crc kubenswrapper[4725]: I0225 10:58:13.238394 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:13 crc kubenswrapper[4725]: I0225 10:58:13.238515 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:13 crc kubenswrapper[4725]: E0225 10:58:13.238733 4725 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:13 crc kubenswrapper[4725]: I0225 10:58:13.239152 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.088481 4725 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="916c4cfc21fc10474cee2bcef2388d85252e5d2dac76fab08aa53901c3d32409" exitCode=0
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.088576 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"916c4cfc21fc10474cee2bcef2388d85252e5d2dac76fab08aa53901c3d32409"}
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.088660 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"de665ac7e18182701bdb7806c6187a0ba86b85f6ce4e6c841043e35e15dcc8ae"}
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.089213 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.089249 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.089521 4725 status_manager.go:851] "Failed to get status for pod" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" pod="openshift-infra/auto-csr-approver-29533618-tbgdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29533618-tbgdb\": dial tcp 38.102.83.196:6443: connect: connection refused"
Feb 25 10:58:14 crc kubenswrapper[4725]: E0225 10:58:14.089762 4725 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.089885 4725 status_manager.go:851] "Failed to get status for pod" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused"
Feb 25 10:58:14 crc kubenswrapper[4725]: I0225 10:58:14.090394 4725 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused"
Feb 25 10:58:15 crc kubenswrapper[4725]: I0225 10:58:15.098223 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a929214246f3bc92d94bd08edc2093c95e4d061c6eb4aebcf090e1cb789c6b0b"}
Feb 25 10:58:15 crc kubenswrapper[4725]: I0225 10:58:15.098568 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0916935b64811743c401ee4a4a6cdfc917b2075d215b7800cb2568da2954b7d7"}
Feb 25 10:58:15 crc kubenswrapper[4725]: I0225 10:58:15.098584 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"78141248b5729cb2970fb5a05f531e852ecf00375329c32ce3469ea6f9d45fda"}
Feb 25 10:58:15 crc kubenswrapper[4725]: I0225 10:58:15.098597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80cbad9241117a9a61abca9e671fe01d00d047c4b8cdef5e634cddf7ad574dfd"}
Feb 25 10:58:16 crc kubenswrapper[4725]: I0225 10:58:16.106081 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c98e32ace4195d410689debee17f132932970b065d2c050d4ee7a0360bd65276"}
Feb 25 10:58:16 crc kubenswrapper[4725]: I0225 10:58:16.106277 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:16 crc kubenswrapper[4725]: I0225 10:58:16.106395 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:16 crc kubenswrapper[4725]: I0225 10:58:16.106424 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:16 crc kubenswrapper[4725]: I0225 10:58:16.951217 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 25 10:58:16 crc kubenswrapper[4725]: I0225 10:58:16.951275 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 25 10:58:17 crc kubenswrapper[4725]: I0225 10:58:17.112736 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 25 10:58:17 crc kubenswrapper[4725]: I0225 10:58:17.113373 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 25 10:58:17 crc kubenswrapper[4725]: I0225 10:58:17.113422 4725 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae" exitCode=1
Feb 25 10:58:17 crc kubenswrapper[4725]: I0225 10:58:17.113450 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae"}
Feb 25 10:58:17 crc kubenswrapper[4725]: I0225 10:58:17.113884 4725 scope.go:117] "RemoveContainer" containerID="7a83acb7f4e4c7bb5799e10e904db838ee4660637196f35f515620318fb764ae"
Feb 25 10:58:18 crc kubenswrapper[4725]: I0225 10:58:18.128017 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 25 10:58:18 crc kubenswrapper[4725]: I0225 10:58:18.128910 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 25 10:58:18 crc kubenswrapper[4725]: I0225 10:58:18.128973 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa2cde7da058bc1bf322b0e98bcae9902ea3d2114fbfb7727e085233235f4147"}
Feb 25 10:58:18 crc kubenswrapper[4725]: I0225 10:58:18.240180 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:18 crc kubenswrapper[4725]: I0225 10:58:18.240274 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:18 crc kubenswrapper[4725]: I0225 10:58:18.248573 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:20 crc kubenswrapper[4725]: I0225 10:58:20.552623 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.089399 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.089473 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.092150 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.092377 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.100744 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.117032 4725 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.120370 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.145107 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.145143 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.149331 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.150955 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ffdbc5d9-b71f-4a5b-b935-c48b6bc1614a"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.190919 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.190968 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.191040 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.192380 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.193270 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.203649 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.206114 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/708f426f-f477-476b-92eb-7ab94a133335-metrics-certs\") pod \"network-metrics-daemon-7k279\" (UID: \"708f426f-f477-476b-92eb-7ab94a133335\") " pod="openshift-multus/network-metrics-daemon-7k279"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.214784 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.215255 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.247395 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.260152 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.271511 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.345124 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 25 10:58:21 crc kubenswrapper[4725]: I0225 10:58:21.353111 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7k279"
Feb 25 10:58:21 crc kubenswrapper[4725]: W0225 10:58:21.686188 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-a372d3064bfc28d906d4a63b29704d61d0ca5b0c187948a61c86ccbd38f21702 WatchSource:0}: Error finding container a372d3064bfc28d906d4a63b29704d61d0ca5b0c187948a61c86ccbd38f21702: Status 404 returned error can't find the container with id a372d3064bfc28d906d4a63b29704d61d0ca5b0c187948a61c86ccbd38f21702
Feb 25 10:58:21 crc kubenswrapper[4725]: W0225 10:58:21.755410 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8b28af582dbbbd5d9753b1ee6b84502b911099883307c358d22459b9c3061bbd WatchSource:0}: Error finding container 8b28af582dbbbd5d9753b1ee6b84502b911099883307c358d22459b9c3061bbd: Status 404 returned error can't find the container with id 8b28af582dbbbd5d9753b1ee6b84502b911099883307c358d22459b9c3061bbd
Feb 25 10:58:21 crc kubenswrapper[4725]: W0225 10:58:21.761650 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-49a1067f5fed39b33c442d32eed58cebba631f0293ed3fa42fb9dd405ef35981 WatchSource:0}: Error finding container 49a1067f5fed39b33c442d32eed58cebba631f0293ed3fa42fb9dd405ef35981: Status 404 returned error can't find the container with id 49a1067f5fed39b33c442d32eed58cebba631f0293ed3fa42fb9dd405ef35981
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.153780 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e28e06ca7b70489baef8d77255ae3f9ffebccae21d1355d3a5b4b1f480acb4f2"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.154181 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"49a1067f5fed39b33c442d32eed58cebba631f0293ed3fa42fb9dd405ef35981"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.156025 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7k279" event={"ID":"708f426f-f477-476b-92eb-7ab94a133335","Type":"ContainerStarted","Data":"0e6fc5e4df245fd8bf58cd5eb3dbc98314056b62afcfddcd22c65a31e171d0f9"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.156067 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7k279" event={"ID":"708f426f-f477-476b-92eb-7ab94a133335","Type":"ContainerStarted","Data":"77a760c02a2c0ce8f403abe79ad77e6c9768068b89137a72d2f9ebee1684c321"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.158507 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"122cfce68108e92aec9f6fc05eb4c252329f4616ee2798b0b42dd7607b715861"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.158537 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8b28af582dbbbd5d9753b1ee6b84502b911099883307c358d22459b9c3061bbd"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.158779 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.161663 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1470c0e8af1b7f94e82ba2098fc2a32b9a0cbe7a40eb6c342c18b0d563c45d8b"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.161747 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a372d3064bfc28d906d4a63b29704d61d0ca5b0c187948a61c86ccbd38f21702"}
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.161774 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:22 crc kubenswrapper[4725]: I0225 10:58:22.161791 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0fd4a582-ec8c-4d92-af5f-9cda0a573098"
Feb 25 10:58:23 crc kubenswrapper[4725]: I0225 10:58:23.169769 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7k279" event={"ID":"708f426f-f477-476b-92eb-7ab94a133335","Type":"ContainerStarted","Data":"670979c989b6199c154797dc03ea7a17d8ab35b49c86190e324bd006d1878b34"}
Feb 25 10:58:24 crc kubenswrapper[4725]: I0225 10:58:24.190313 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Feb 25 10:58:24 crc kubenswrapper[4725]: I0225 10:58:24.190394 4725 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="1470c0e8af1b7f94e82ba2098fc2a32b9a0cbe7a40eb6c342c18b0d563c45d8b" exitCode=255
Feb 25 10:58:24 crc kubenswrapper[4725]: I0225 10:58:24.190478 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"1470c0e8af1b7f94e82ba2098fc2a32b9a0cbe7a40eb6c342c18b0d563c45d8b"}
Feb 25 10:58:24 crc kubenswrapper[4725]: I0225 10:58:24.191126 4725 scope.go:117] "RemoveContainer" containerID="1470c0e8af1b7f94e82ba2098fc2a32b9a0cbe7a40eb6c342c18b0d563c45d8b"
Feb 25 10:58:25 crc kubenswrapper[4725]: I0225 10:58:25.252606 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ffdbc5d9-b71f-4a5b-b935-c48b6bc1614a"
Feb 25 10:58:25 crc kubenswrapper[4725]: I0225 10:58:25.424424 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Feb 25 10:58:25 crc kubenswrapper[4725]: I0225 10:58:25.424484 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d594e26b610552fe691f6a06a082ba95445db5962ea6178857f80a29a292a4b1"}
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.435497 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.437434 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.437729 4725 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="d594e26b610552fe691f6a06a082ba95445db5962ea6178857f80a29a292a4b1" exitCode=255
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.437914 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"d594e26b610552fe691f6a06a082ba95445db5962ea6178857f80a29a292a4b1"}
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.438192 4725 scope.go:117] "RemoveContainer" containerID="1470c0e8af1b7f94e82ba2098fc2a32b9a0cbe7a40eb6c342c18b0d563c45d8b"
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.439238 4725 scope.go:117] "RemoveContainer" containerID="d594e26b610552fe691f6a06a082ba95445db5962ea6178857f80a29a292a4b1"
Feb 25 10:58:26 crc kubenswrapper[4725]: E0225 10:58:26.439617 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.533277 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 10:58:26 crc kubenswrapper[4725]: I0225 10:58:26.539128 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 10:58:27 crc kubenswrapper[4725]: I0225 10:58:27.274661 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 25 10:58:27 crc kubenswrapper[4725]: I0225 10:58:27.448255 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Feb 25 10:58:27 crc kubenswrapper[4725]: I0225 10:58:27.453772 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 10:58:27 crc kubenswrapper[4725]: I0225 10:58:27.455606 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 25 10:58:27 crc kubenswrapper[4725]: I0225 10:58:27.488055 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 25 10:58:29 crc kubenswrapper[4725]: I0225 10:58:29.079132 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 25 10:58:29 crc kubenswrapper[4725]: I0225 10:58:29.443253 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 25 10:58:30 crc kubenswrapper[4725]: I0225 10:58:30.666989 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 25 10:58:32 crc kubenswrapper[4725]: I0225 10:58:32.428884 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 25 10:58:32 crc kubenswrapper[4725]: I0225 10:58:32.602087 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 25 10:58:32 crc kubenswrapper[4725]: I0225 10:58:32.827953 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 25 10:58:33 crc kubenswrapper[4725]: I0225 10:58:33.185585 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 25 10:58:33 crc kubenswrapper[4725]: I0225 10:58:33.406332 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 25 10:58:33 crc kubenswrapper[4725]: I0225 10:58:33.457561 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 25 10:58:33 crc kubenswrapper[4725]: I0225 10:58:33.917036 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 25 10:58:34 crc kubenswrapper[4725]: I0225 10:58:34.088667 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 25 10:58:34 crc kubenswrapper[4725]: I0225 10:58:34.152890 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 25 10:58:34 crc kubenswrapper[4725]: I0225 10:58:34.318000 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 25 10:58:34 crc kubenswrapper[4725]: I0225 10:58:34.490556 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 25 10:58:34 crc kubenswrapper[4725]: I0225 10:58:34.778075 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 25 10:58:35 crc kubenswrapper[4725]: I0225 10:58:35.184031 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 25 10:58:35 crc kubenswrapper[4725]: I0225 10:58:35.418649 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 25 10:58:35 crc kubenswrapper[4725]: I0225 10:58:35.419069 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 25 10:58:35 crc kubenswrapper[4725]: I0225 10:58:35.430791 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 25 10:58:35 crc kubenswrapper[4725]: I0225 10:58:35.437424 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 25 10:58:35 crc kubenswrapper[4725]: I0225 10:58:35.930409 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 25 10:58:35 crc kubenswrapper[4725]: I0225 10:58:35.980137 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.074812 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.094969 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.114176 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.293071 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.344055 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.429486 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.447341 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.507901 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.676685 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.728781 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.801673 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.825768 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.828283 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 25 10:58:36 crc kubenswrapper[4725]: I0225 10:58:36.969920 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.096131 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.129582 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.202874 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.218107 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.226092 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.273629 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.282317 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.327497 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.377488 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.417383 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.438411 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.549330 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.618330 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.624042 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.702755 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.703518 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.711084 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.880285 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.921154 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 25 10:58:37 crc kubenswrapper[4725]: I0225 10:58:37.972527 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.095016 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.187065 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.200538 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.213663 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 25
10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.224291 4725 scope.go:117] "RemoveContainer" containerID="d594e26b610552fe691f6a06a082ba95445db5962ea6178857f80a29a292a4b1" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.344126 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.410881 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.427181 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.518365 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.518724 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e64a7212adf70315ea7926289ef72f248beb8289b3a42be83ee649f6bcb1bbee"} Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.521120 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.616883 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.628312 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.685570 4725 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.687421 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.717717 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.748473 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.819462 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.855182 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.880248 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.961467 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 10:58:38 crc kubenswrapper[4725]: I0225 10:58:38.992972 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.208994 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.412807 4725 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.469968 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.520147 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.554580 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.574770 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.629241 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.638117 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.640193 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.687617 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.727524 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.767942 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.831910 4725 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.872967 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.885303 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.940161 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.986707 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 25 10:58:39 crc kubenswrapper[4725]: I0225 10:58:39.987074 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.050745 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.113608 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.156819 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.237077 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.261128 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.323146 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.362994 4725 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.399303 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.438693 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.468915 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.481046 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.544806 4725 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.622375 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.656742 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.689299 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.758270 4725 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 25 10:58:40 crc kubenswrapper[4725]: I0225 10:58:40.943609 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.015777 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.215757 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.229199 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.233298 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.322662 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.352285 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.353132 4725 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.355642 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7k279" podStartSLOduration=292.355628611 podStartE2EDuration="4m52.355628611s" podCreationTimestamp="2026-02-25 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:58:23.181193427 +0000 UTC m=+328.679775472" watchObservedRunningTime="2026-02-25 10:58:41.355628611 +0000 UTC m=+346.854210636" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.356469 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.356463185 podStartE2EDuration="39.356463185s" podCreationTimestamp="2026-02-25 10:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:58:20.912278285 +0000 UTC m=+326.410860330" watchObservedRunningTime="2026-02-25 10:58:41.356463185 +0000 UTC m=+346.855045210" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.358666 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.358715 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.358737 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7k279"] Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.382555 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.382524621 podStartE2EDuration="20.382524621s" podCreationTimestamp="2026-02-25 10:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 10:58:41.381021997 +0000 UTC m=+346.879604022" watchObservedRunningTime="2026-02-25 10:58:41.382524621 +0000 UTC m=+346.881106656" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.474313 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.486414 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.559296 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.588776 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.666475 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.688636 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.732298 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.753404 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.821610 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.857312 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 10:58:41.914671 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 10:58:41 crc kubenswrapper[4725]: I0225 
10:58:41.940063 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.094504 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.103061 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.360228 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.407945 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.430949 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.439674 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.487850 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.519506 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.571336 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.574392 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 
25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.596225 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.604132 4725 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.622425 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.704415 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.720573 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.756579 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.781730 4725 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.785420 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.824176 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 25 10:58:42 crc kubenswrapper[4725]: I0225 10:58:42.824418 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.054071 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 25 10:58:43 crc kubenswrapper[4725]: 
I0225 10:58:43.164107 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.209100 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.214036 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.251131 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.351359 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.382255 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.468468 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.483931 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.491276 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.498919 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.541453 4725 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.541725 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1" gracePeriod=5 Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.577271 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.600917 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.675847 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 25 10:58:43 crc kubenswrapper[4725]: I0225 10:58:43.854279 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.002740 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.025079 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.029506 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.100704 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 25 
10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.143170 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.257481 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.305568 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.311128 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.339501 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.359958 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.368966 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.422979 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.496543 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.521560 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.524122 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.668779 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.698657 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.725496 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.737636 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.775086 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 25 10:58:44 crc kubenswrapper[4725]: I0225 10:58:44.947853 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.012196 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.028088 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.175019 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.175798 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 25 10:58:45 crc 
kubenswrapper[4725]: I0225 10:58:45.224393 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.370037 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.481648 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.482540 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.543949 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.622160 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.646687 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.669771 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.835204 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.853620 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.911076 4725 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.928764 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 10:58:45 crc kubenswrapper[4725]: I0225 10:58:45.989455 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.041913 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.105159 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.143289 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.160640 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.357612 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.425498 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.514048 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.677584 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.690156 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.783167 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.798342 4725 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.846046 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 25 10:58:46 crc kubenswrapper[4725]: I0225 10:58:46.933916 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.045122 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.103861 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.229628 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.319183 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.321340 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.518492 4725 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.550797 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.635186 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.641353 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.673216 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.686098 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 25 10:58:47 crc kubenswrapper[4725]: I0225 10:58:47.852926 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 25 10:58:48 crc kubenswrapper[4725]: I0225 10:58:48.070021 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 25 10:58:48 crc kubenswrapper[4725]: I0225 10:58:48.280083 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 25 10:58:48 crc kubenswrapper[4725]: I0225 10:58:48.388676 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 25 10:58:48 crc kubenswrapper[4725]: I0225 10:58:48.459438 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 25 10:58:48 crc kubenswrapper[4725]: I0225 10:58:48.599712 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 25 10:58:48 crc kubenswrapper[4725]: I0225 10:58:48.715698 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 25 10:58:48 crc kubenswrapper[4725]: I0225 10:58:48.995040 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.124173 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.128650 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.128728 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.139325 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.231997 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.249863 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.249910 4725 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="afba339f-703e-4536-91af-6fdcfec8d2ea" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.249939 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.249955 4725 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="afba339f-703e-4536-91af-6fdcfec8d2ea" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.272789 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.272915 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.272999 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273053 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273078 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273100 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273186 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273215 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273326 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273535 4725 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273558 4725 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273577 4725 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.273599 4725 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.287700 4725 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.294394 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.375487 4725 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.581868 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.581928 4725 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1" exitCode=137 Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.581974 4725 scope.go:117] "RemoveContainer" containerID="e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.582058 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.582069 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.601334 4725 scope.go:117] "RemoveContainer" containerID="e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1" Feb 25 10:58:49 crc kubenswrapper[4725]: E0225 10:58:49.603962 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1\": container with ID starting with e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1 not found: ID does not exist" containerID="e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.604033 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1"} err="failed to get container status \"e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1\": rpc error: code = NotFound desc = could not find container \"e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1\": container with ID starting with e86dd0e487cf5fbc3f0287e40d454d256ba5c305618b63b5c897f63b796a4ac1 not found: ID does not exist" Feb 25 10:58:49 crc kubenswrapper[4725]: I0225 10:58:49.913337 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 25 10:58:50 crc kubenswrapper[4725]: I0225 10:58:50.645157 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 25 10:58:50 crc kubenswrapper[4725]: I0225 10:58:50.729516 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Feb 25 10:58:50 crc kubenswrapper[4725]: I0225 10:58:50.826047 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 25 10:58:51 crc kubenswrapper[4725]: I0225 10:58:51.156261 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 10:58:51 crc kubenswrapper[4725]: I0225 10:58:51.227348 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 10:58:51 crc kubenswrapper[4725]: I0225 10:58:51.234120 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 25 10:58:51 crc kubenswrapper[4725]: I0225 10:58:51.266533 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 10:59:08 crc kubenswrapper[4725]: I0225 10:59:08.697643 4725 generic.go:334] "Generic (PLEG): container finished" podID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerID="0fba3923d377ead43ca148ced91266be82ac2d3f9cc0d9b2ed601dbe007189f5" exitCode=0 Feb 25 10:59:08 crc kubenswrapper[4725]: I0225 10:59:08.697725 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" event={"ID":"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474","Type":"ContainerDied","Data":"0fba3923d377ead43ca148ced91266be82ac2d3f9cc0d9b2ed601dbe007189f5"} Feb 25 10:59:08 crc kubenswrapper[4725]: I0225 10:59:08.699502 4725 scope.go:117] "RemoveContainer" containerID="0fba3923d377ead43ca148ced91266be82ac2d3f9cc0d9b2ed601dbe007189f5" Feb 25 10:59:09 crc kubenswrapper[4725]: I0225 10:59:09.707629 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-m7624" event={"ID":"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474","Type":"ContainerStarted","Data":"7d9de28b7aa8e7741cd8b9b386367e30114fcdb10953c364e00e85a634cca82b"} Feb 25 10:59:09 crc kubenswrapper[4725]: I0225 10:59:09.708493 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 10:59:09 crc kubenswrapper[4725]: I0225 10:59:09.710204 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.138283 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533620-79r87"] Feb 25 11:00:00 crc kubenswrapper[4725]: E0225 11:00:00.139180 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" containerName="oc" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.139195 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" containerName="oc" Feb 25 11:00:00 crc kubenswrapper[4725]: E0225 11:00:00.139217 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.139225 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 11:00:00 crc kubenswrapper[4725]: E0225 11:00:00.139234 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" containerName="installer" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.139241 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" containerName="installer" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.139344 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="76bf95fe-88cd-4f68-a0d4-a5059c8b666a" containerName="installer" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.139360 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.139373 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" containerName="oc" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.139799 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533620-79r87" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.143180 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.143290 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.143425 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.149244 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq"] Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.150079 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.156747 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.156870 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.159936 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533620-79r87"] Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.168756 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq"] Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.279241 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc7e258-78da-488a-923a-d133cc3a1d03-config-volume\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.279308 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc7e258-78da-488a-923a-d133cc3a1d03-secret-volume\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.279350 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4rks\" (UniqueName: 
\"kubernetes.io/projected/ffc7e258-78da-488a-923a-d133cc3a1d03-kube-api-access-k4rks\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.279810 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zc9c\" (UniqueName: \"kubernetes.io/projected/94932c77-7581-4291-bb30-55e751a0923c-kube-api-access-7zc9c\") pod \"auto-csr-approver-29533620-79r87\" (UID: \"94932c77-7581-4291-bb30-55e751a0923c\") " pod="openshift-infra/auto-csr-approver-29533620-79r87" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.380919 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zc9c\" (UniqueName: \"kubernetes.io/projected/94932c77-7581-4291-bb30-55e751a0923c-kube-api-access-7zc9c\") pod \"auto-csr-approver-29533620-79r87\" (UID: \"94932c77-7581-4291-bb30-55e751a0923c\") " pod="openshift-infra/auto-csr-approver-29533620-79r87" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.380980 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc7e258-78da-488a-923a-d133cc3a1d03-config-volume\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.381017 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc7e258-78da-488a-923a-d133cc3a1d03-secret-volume\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 
11:00:00.381059 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4rks\" (UniqueName: \"kubernetes.io/projected/ffc7e258-78da-488a-923a-d133cc3a1d03-kube-api-access-k4rks\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.382085 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc7e258-78da-488a-923a-d133cc3a1d03-config-volume\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.388375 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc7e258-78da-488a-923a-d133cc3a1d03-secret-volume\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.396003 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zc9c\" (UniqueName: \"kubernetes.io/projected/94932c77-7581-4291-bb30-55e751a0923c-kube-api-access-7zc9c\") pod \"auto-csr-approver-29533620-79r87\" (UID: \"94932c77-7581-4291-bb30-55e751a0923c\") " pod="openshift-infra/auto-csr-approver-29533620-79r87" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.402799 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4rks\" (UniqueName: \"kubernetes.io/projected/ffc7e258-78da-488a-923a-d133cc3a1d03-kube-api-access-k4rks\") pod \"collect-profiles-29533620-8q6wq\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.464015 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533620-79r87" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.475242 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.896160 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq"] Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.945805 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533620-79r87"] Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.994554 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533620-79r87" event={"ID":"94932c77-7581-4291-bb30-55e751a0923c","Type":"ContainerStarted","Data":"deb7e2b48cba1bd3bd5d9f428d7ad0e6919c5eed5d8646abf8aeec690908c09a"} Feb 25 11:00:00 crc kubenswrapper[4725]: I0225 11:00:00.995884 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" event={"ID":"ffc7e258-78da-488a-923a-d133cc3a1d03","Type":"ContainerStarted","Data":"f58b838a59938cfebcffcd1ddfe98aeb9307abba1f5c040a92cb012f41b9d783"} Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.600025 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjxjp"] Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.600382 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qjxjp" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="registry-server" 
containerID="cri-o://7fa51daade8fd55cabfe4a6034a853e9418f682d31aedcf824c57c0d43a1972e" gracePeriod=30 Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.611232 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2tdp"] Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.611762 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l2tdp" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="registry-server" containerID="cri-o://b677fbff2f3f581bfa3558d28c0c83f69a3ec4bd8085b0eccda8263e35419e27" gracePeriod=30 Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.623629 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7624"] Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.623872 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator" containerID="cri-o://7d9de28b7aa8e7741cd8b9b386367e30114fcdb10953c364e00e85a634cca82b" gracePeriod=30 Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.635144 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c8m5"] Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.635421 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6c8m5" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="registry-server" containerID="cri-o://b6aa6d0cb717b5cee60b22bc2cb6728496633a69bbe01ea2a946d586385c6f2c" gracePeriod=30 Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.648250 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k82sj"] Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 
11:00:01.651210 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.661597 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t54lf"] Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.661867 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t54lf" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="registry-server" containerID="cri-o://ae07f7d3fb90d681f57053b60eab106d7556e64e586bba9421777633a3d0b0cc" gracePeriod=30 Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.669541 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k82sj"] Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.697979 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.698077 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.698118 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gpp\" (UniqueName: 
\"kubernetes.io/projected/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-kube-api-access-d9gpp\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.799386 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.799435 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gpp\" (UniqueName: \"kubernetes.io/projected/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-kube-api-access-d9gpp\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.799479 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.801263 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc 
kubenswrapper[4725]: I0225 11:00:01.808295 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:01 crc kubenswrapper[4725]: I0225 11:00:01.818355 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gpp\" (UniqueName: \"kubernetes.io/projected/d18563e5-7e1f-4e98-9419-d71fa34b9fd2-kube-api-access-d9gpp\") pod \"marketplace-operator-79b997595-k82sj\" (UID: \"d18563e5-7e1f-4e98-9419-d71fa34b9fd2\") " pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.010260 4725 generic.go:334] "Generic (PLEG): container finished" podID="ffc7e258-78da-488a-923a-d133cc3a1d03" containerID="1c2c73cb9a136828ce516615f2b242b82dbd4b8c6b7f647249caf14c33944f67" exitCode=0 Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.010307 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" event={"ID":"ffc7e258-78da-488a-923a-d133cc3a1d03","Type":"ContainerDied","Data":"1c2c73cb9a136828ce516615f2b242b82dbd4b8c6b7f647249caf14c33944f67"} Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.026035 4725 generic.go:334] "Generic (PLEG): container finished" podID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerID="b677fbff2f3f581bfa3558d28c0c83f69a3ec4bd8085b0eccda8263e35419e27" exitCode=0 Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.026100 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tdp" 
event={"ID":"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d","Type":"ContainerDied","Data":"b677fbff2f3f581bfa3558d28c0c83f69a3ec4bd8085b0eccda8263e35419e27"} Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.030378 4725 generic.go:334] "Generic (PLEG): container finished" podID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerID="7d9de28b7aa8e7741cd8b9b386367e30114fcdb10953c364e00e85a634cca82b" exitCode=0 Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.030446 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" event={"ID":"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474","Type":"ContainerDied","Data":"7d9de28b7aa8e7741cd8b9b386367e30114fcdb10953c364e00e85a634cca82b"} Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.030530 4725 scope.go:117] "RemoveContainer" containerID="0fba3923d377ead43ca148ced91266be82ac2d3f9cc0d9b2ed601dbe007189f5" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.032851 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjxjp" event={"ID":"8f0d98c3-7ffa-4029-ab5c-c252062b3099","Type":"ContainerDied","Data":"7fa51daade8fd55cabfe4a6034a853e9418f682d31aedcf824c57c0d43a1972e"} Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.032808 4725 generic.go:334] "Generic (PLEG): container finished" podID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerID="7fa51daade8fd55cabfe4a6034a853e9418f682d31aedcf824c57c0d43a1972e" exitCode=0 Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.032994 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjxjp" event={"ID":"8f0d98c3-7ffa-4029-ab5c-c252062b3099","Type":"ContainerDied","Data":"085fe9f2cc9986df50f2b1b381ed651a74148caa28c3f7c0c1a9fffce075d7d6"} Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.033008 4725 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="085fe9f2cc9986df50f2b1b381ed651a74148caa28c3f7c0c1a9fffce075d7d6" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.034576 4725 generic.go:334] "Generic (PLEG): container finished" podID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerID="b6aa6d0cb717b5cee60b22bc2cb6728496633a69bbe01ea2a946d586385c6f2c" exitCode=0 Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.034636 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c8m5" event={"ID":"34091911-8e18-4a85-b0c2-a07e3c1a7e28","Type":"ContainerDied","Data":"b6aa6d0cb717b5cee60b22bc2cb6728496633a69bbe01ea2a946d586385c6f2c"} Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.035694 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.038383 4725 generic.go:334] "Generic (PLEG): container finished" podID="85249796-156c-4e21-81ee-d4cca9c8a607" containerID="ae07f7d3fb90d681f57053b60eab106d7556e64e586bba9421777633a3d0b0cc" exitCode=0 Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.038419 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t54lf" event={"ID":"85249796-156c-4e21-81ee-d4cca9c8a607","Type":"ContainerDied","Data":"ae07f7d3fb90d681f57053b60eab106d7556e64e586bba9421777633a3d0b0cc"} Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.038758 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjxjp" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.059514 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.070746 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c8m5" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.072246 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tdp" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.118966 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206323 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-trusted-ca\") pod \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206382 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-operator-metrics\") pod \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206404 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7zw7\" (UniqueName: \"kubernetes.io/projected/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-kube-api-access-t7zw7\") pod \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206429 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkj27\" (UniqueName: \"kubernetes.io/projected/34091911-8e18-4a85-b0c2-a07e3c1a7e28-kube-api-access-wkj27\") pod \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " Feb 25 11:00:02 crc kubenswrapper[4725]: 
I0225 11:00:02.206466 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-utilities\") pod \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206493 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-catalog-content\") pod \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206523 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-catalog-content\") pod \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206557 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-catalog-content\") pod \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206576 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-utilities\") pod \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\" (UID: \"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206606 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-utilities\") pod 
\"34091911-8e18-4a85-b0c2-a07e3c1a7e28\" (UID: \"34091911-8e18-4a85-b0c2-a07e3c1a7e28\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206628 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl4qg\" (UniqueName: \"kubernetes.io/projected/8f0d98c3-7ffa-4029-ab5c-c252062b3099-kube-api-access-gl4qg\") pod \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\" (UID: \"8f0d98c3-7ffa-4029-ab5c-c252062b3099\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.206655 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nht5s\" (UniqueName: \"kubernetes.io/projected/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-kube-api-access-nht5s\") pod \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\" (UID: \"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.207937 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-utilities" (OuterVolumeSpecName: "utilities") pod "d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" (UID: "d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.209271 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-utilities" (OuterVolumeSpecName: "utilities") pod "34091911-8e18-4a85-b0c2-a07e3c1a7e28" (UID: "34091911-8e18-4a85-b0c2-a07e3c1a7e28"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.210525 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-utilities" (OuterVolumeSpecName: "utilities") pod "8f0d98c3-7ffa-4029-ab5c-c252062b3099" (UID: "8f0d98c3-7ffa-4029-ab5c-c252062b3099"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.211183 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" (UID: "a2d2f1c0-7bd7-48d1-ab38-058b4bee2474"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.211547 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-kube-api-access-t7zw7" (OuterVolumeSpecName: "kube-api-access-t7zw7") pod "d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" (UID: "d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d"). InnerVolumeSpecName "kube-api-access-t7zw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.212658 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0d98c3-7ffa-4029-ab5c-c252062b3099-kube-api-access-gl4qg" (OuterVolumeSpecName: "kube-api-access-gl4qg") pod "8f0d98c3-7ffa-4029-ab5c-c252062b3099" (UID: "8f0d98c3-7ffa-4029-ab5c-c252062b3099"). InnerVolumeSpecName "kube-api-access-gl4qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.212679 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34091911-8e18-4a85-b0c2-a07e3c1a7e28-kube-api-access-wkj27" (OuterVolumeSpecName: "kube-api-access-wkj27") pod "34091911-8e18-4a85-b0c2-a07e3c1a7e28" (UID: "34091911-8e18-4a85-b0c2-a07e3c1a7e28"). InnerVolumeSpecName "kube-api-access-wkj27". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.214163 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-kube-api-access-nht5s" (OuterVolumeSpecName: "kube-api-access-nht5s") pod "a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" (UID: "a2d2f1c0-7bd7-48d1-ab38-058b4bee2474"). InnerVolumeSpecName "kube-api-access-nht5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.214984 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" (UID: "a2d2f1c0-7bd7-48d1-ab38-058b4bee2474"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.237963 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34091911-8e18-4a85-b0c2-a07e3c1a7e28" (UID: "34091911-8e18-4a85-b0c2-a07e3c1a7e28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.267036 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k82sj"] Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.276631 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" (UID: "d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.286060 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f0d98c3-7ffa-4029-ab5c-c252062b3099" (UID: "8f0d98c3-7ffa-4029-ab5c-c252062b3099"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.307902 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-utilities\") pod \"85249796-156c-4e21-81ee-d4cca9c8a607\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.308180 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-catalog-content\") pod \"85249796-156c-4e21-81ee-d4cca9c8a607\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.308342 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk7r6\" (UniqueName: \"kubernetes.io/projected/85249796-156c-4e21-81ee-d4cca9c8a607-kube-api-access-sk7r6\") pod \"85249796-156c-4e21-81ee-d4cca9c8a607\" (UID: \"85249796-156c-4e21-81ee-d4cca9c8a607\") " Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.308697 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkj27\" (UniqueName: \"kubernetes.io/projected/34091911-8e18-4a85-b0c2-a07e3c1a7e28-kube-api-access-wkj27\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.308804 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.308908 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d98c3-7ffa-4029-ab5c-c252062b3099-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309014 
4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309112 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309207 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309296 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34091911-8e18-4a85-b0c2-a07e3c1a7e28-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309390 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl4qg\" (UniqueName: \"kubernetes.io/projected/8f0d98c3-7ffa-4029-ab5c-c252062b3099-kube-api-access-gl4qg\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309499 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nht5s\" (UniqueName: \"kubernetes.io/projected/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-kube-api-access-nht5s\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309586 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309678 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309769 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7zw7\" (UniqueName: \"kubernetes.io/projected/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d-kube-api-access-t7zw7\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.309732 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-utilities" (OuterVolumeSpecName: "utilities") pod "85249796-156c-4e21-81ee-d4cca9c8a607" (UID: "85249796-156c-4e21-81ee-d4cca9c8a607"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.311932 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85249796-156c-4e21-81ee-d4cca9c8a607-kube-api-access-sk7r6" (OuterVolumeSpecName: "kube-api-access-sk7r6") pod "85249796-156c-4e21-81ee-d4cca9c8a607" (UID: "85249796-156c-4e21-81ee-d4cca9c8a607"). InnerVolumeSpecName "kube-api-access-sk7r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.410772 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk7r6\" (UniqueName: \"kubernetes.io/projected/85249796-156c-4e21-81ee-d4cca9c8a607-kube-api-access-sk7r6\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.410806 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.447219 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85249796-156c-4e21-81ee-d4cca9c8a607" (UID: "85249796-156c-4e21-81ee-d4cca9c8a607"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.511813 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85249796-156c-4e21-81ee-d4cca9c8a607-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.938101 4725 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7624 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 11:00:02 crc kubenswrapper[4725]: I0225 11:00:02.938431 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.17:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.043938 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" event={"ID":"d18563e5-7e1f-4e98-9419-d71fa34b9fd2","Type":"ContainerStarted","Data":"4585122d88f072a65e181b32d5838289b311289528202a9cab4805871f2b5a9b"} Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.043977 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" event={"ID":"d18563e5-7e1f-4e98-9419-d71fa34b9fd2","Type":"ContainerStarted","Data":"c744a827d86c65be97524b3291f7aba7b7b4a07bfffc67076c464212aa9d91b1"} Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.044105 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.045655 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c8m5" event={"ID":"34091911-8e18-4a85-b0c2-a07e3c1a7e28","Type":"ContainerDied","Data":"5b639b012f956421200bdca7ca123ac62175d18af5b775aadc6c6612cb4903b8"} Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.045682 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c8m5" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.045706 4725 scope.go:117] "RemoveContainer" containerID="b6aa6d0cb717b5cee60b22bc2cb6728496633a69bbe01ea2a946d586385c6f2c" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.048562 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t54lf" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.048558 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t54lf" event={"ID":"85249796-156c-4e21-81ee-d4cca9c8a607","Type":"ContainerDied","Data":"391bc5dcdb87e8ef5d9001f5f2d5e6375d4a0c4565db01a82eb4a779e5407d9b"} Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.048914 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.050330 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tdp" event={"ID":"d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d","Type":"ContainerDied","Data":"f0175480ac8d60fb63694937c29067d5d646a5245d2f5865cdf1ea4cc93d60db"} Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.050411 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tdp" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.052883 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7624" event={"ID":"a2d2f1c0-7bd7-48d1-ab38-058b4bee2474","Type":"ContainerDied","Data":"99c9e06b2baa4d1223eeff0c502f98e8223eec3bb0f0407507eca3448d16e1a0"} Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.052934 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjxjp" Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.052940 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7624"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.062509 4725 scope.go:117] "RemoveContainer" containerID="b820657fdd9e54d4ef0bdb02170f3e7fe45c81e5514022e13635bb86fb461be6"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.069278 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k82sj" podStartSLOduration=2.069256858 podStartE2EDuration="2.069256858s" podCreationTimestamp="2026-02-25 11:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:00:03.063034195 +0000 UTC m=+428.561616220" watchObservedRunningTime="2026-02-25 11:00:03.069256858 +0000 UTC m=+428.567838883"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.107051 4725 scope.go:117] "RemoveContainer" containerID="f85db8f74d363c09e1852d5286b16203f0dd9993771eb2931945dad3ff8edd43"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.117752 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t54lf"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.122222 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t54lf"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.132474 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7624"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.132747 4725 scope.go:117] "RemoveContainer" containerID="ae07f7d3fb90d681f57053b60eab106d7556e64e586bba9421777633a3d0b0cc"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.137549 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7624"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.145081 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c8m5"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.149231 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c8m5"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.160047 4725 scope.go:117] "RemoveContainer" containerID="b6669f7a05a7046086fc2f480ed4d3967a8ab3b212433ec6d937674d0250a200"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.176949 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2tdp"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.188247 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2tdp"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.188592 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjxjp"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.194708 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjxjp"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.197893 4725 scope.go:117] "RemoveContainer" containerID="049aa5c689e2c3873b2df13fea4bc907c104665f81898fe7524eb7a4757a5cdb"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.234450 4725 scope.go:117] "RemoveContainer" containerID="b677fbff2f3f581bfa3558d28c0c83f69a3ec4bd8085b0eccda8263e35419e27"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.237267 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" path="/var/lib/kubelet/pods/34091911-8e18-4a85-b0c2-a07e3c1a7e28/volumes"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.237988 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" path="/var/lib/kubelet/pods/85249796-156c-4e21-81ee-d4cca9c8a607/volumes"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.238664 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" path="/var/lib/kubelet/pods/8f0d98c3-7ffa-4029-ab5c-c252062b3099/volumes"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.239897 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" path="/var/lib/kubelet/pods/a2d2f1c0-7bd7-48d1-ab38-058b4bee2474/volumes"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.240444 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" path="/var/lib/kubelet/pods/d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d/volumes"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.248211 4725 scope.go:117] "RemoveContainer" containerID="32a2ff4338245e34564a739b7d146279013b13f5fa8c486eeeaf31f86f4ef8cc"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.263043 4725 scope.go:117] "RemoveContainer" containerID="c6a64853a4a31dcea88de6c448cef25fe7aa5ab333229a9d1cce19e5f6b6f030"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.276714 4725 scope.go:117] "RemoveContainer" containerID="7d9de28b7aa8e7741cd8b9b386367e30114fcdb10953c364e00e85a634cca82b"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.299157 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.423088 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc7e258-78da-488a-923a-d133cc3a1d03-config-volume\") pod \"ffc7e258-78da-488a-923a-d133cc3a1d03\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") "
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.423161 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc7e258-78da-488a-923a-d133cc3a1d03-secret-volume\") pod \"ffc7e258-78da-488a-923a-d133cc3a1d03\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") "
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.423209 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4rks\" (UniqueName: \"kubernetes.io/projected/ffc7e258-78da-488a-923a-d133cc3a1d03-kube-api-access-k4rks\") pod \"ffc7e258-78da-488a-923a-d133cc3a1d03\" (UID: \"ffc7e258-78da-488a-923a-d133cc3a1d03\") "
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.424423 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffc7e258-78da-488a-923a-d133cc3a1d03-config-volume" (OuterVolumeSpecName: "config-volume") pod "ffc7e258-78da-488a-923a-d133cc3a1d03" (UID: "ffc7e258-78da-488a-923a-d133cc3a1d03"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.427915 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc7e258-78da-488a-923a-d133cc3a1d03-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ffc7e258-78da-488a-923a-d133cc3a1d03" (UID: "ffc7e258-78da-488a-923a-d133cc3a1d03"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.430546 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc7e258-78da-488a-923a-d133cc3a1d03-kube-api-access-k4rks" (OuterVolumeSpecName: "kube-api-access-k4rks") pod "ffc7e258-78da-488a-923a-d133cc3a1d03" (UID: "ffc7e258-78da-488a-923a-d133cc3a1d03"). InnerVolumeSpecName "kube-api-access-k4rks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.524666 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc7e258-78da-488a-923a-d133cc3a1d03-config-volume\") on node \"crc\" DevicePath \"\""
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.524896 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc7e258-78da-488a-923a-d133cc3a1d03-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.525025 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4rks\" (UniqueName: \"kubernetes.io/projected/ffc7e258-78da-488a-923a-d133cc3a1d03-kube-api-access-k4rks\") on node \"crc\" DevicePath \"\""
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.618944 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-99g6v"]
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.619558 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.619623 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.619678 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.619729 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.619783 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.619863 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.619930 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.619986 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620043 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620097 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620148 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620199 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620256 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620310 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620363 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620412 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620470 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620525 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620587 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620639 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620690 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc7e258-78da-488a-923a-d133cc3a1d03" containerName="collect-profiles"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620739 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc7e258-78da-488a-923a-d133cc3a1d03" containerName="collect-profiles"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620795 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.620883 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="extract-content"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.620957 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621009 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.621063 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621112 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: E0225 11:00:03.621161 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621282 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="extract-utilities"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621462 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc7e258-78da-488a-923a-d133cc3a1d03" containerName="collect-profiles"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621532 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621587 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="34091911-8e18-4a85-b0c2-a07e3c1a7e28" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621638 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ef9bd4-1242-4f3f-8a79-6ecfe693cd2d" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621691 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="85249796-156c-4e21-81ee-d4cca9c8a607" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621749 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d2f1c0-7bd7-48d1-ab38-058b4bee2474" containerName="marketplace-operator"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.621802 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0d98c3-7ffa-4029-ab5c-c252062b3099" containerName="registry-server"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.622488 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.624949 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.627051 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-catalog-content\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.627375 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-utilities\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.627512 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks2j8\" (UniqueName: \"kubernetes.io/projected/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-kube-api-access-ks2j8\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.630307 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99g6v"]
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.729045 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-utilities\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.729103 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks2j8\" (UniqueName: \"kubernetes.io/projected/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-kube-api-access-ks2j8\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.729160 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-catalog-content\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.729559 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-utilities\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.731573 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-catalog-content\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.746265 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks2j8\" (UniqueName: \"kubernetes.io/projected/ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a-kube-api-access-ks2j8\") pod \"redhat-operators-99g6v\" (UID: \"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a\") " pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:03 crc kubenswrapper[4725]: I0225 11:00:03.947626 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99g6v"
Feb 25 11:00:04 crc kubenswrapper[4725]: I0225 11:00:04.069068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq" event={"ID":"ffc7e258-78da-488a-923a-d133cc3a1d03","Type":"ContainerDied","Data":"f58b838a59938cfebcffcd1ddfe98aeb9307abba1f5c040a92cb012f41b9d783"}
Feb 25 11:00:04 crc kubenswrapper[4725]: I0225 11:00:04.069120 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f58b838a59938cfebcffcd1ddfe98aeb9307abba1f5c040a92cb012f41b9d783"
Feb 25 11:00:04 crc kubenswrapper[4725]: I0225 11:00:04.069118 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq"
Feb 25 11:00:04 crc kubenswrapper[4725]: I0225 11:00:04.135713 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99g6v"]
Feb 25 11:00:04 crc kubenswrapper[4725]: W0225 11:00:04.139353 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podece6c2fe_4eaa_4d6e_bb4a_2f229f45f57a.slice/crio-79b6c9f7b5a395ad42bebb5f0103314bfcc3bf4b0b4b168260d0da73f4fe9d0c WatchSource:0}: Error finding container 79b6c9f7b5a395ad42bebb5f0103314bfcc3bf4b0b4b168260d0da73f4fe9d0c: Status 404 returned error can't find the container with id 79b6c9f7b5a395ad42bebb5f0103314bfcc3bf4b0b4b168260d0da73f4fe9d0c
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.077805 4725 generic.go:334] "Generic (PLEG): container finished" podID="ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a" containerID="15a8c43ec004e4005838d20c30d9afc469e8e7343e7f9c567c1055f03f91a202" exitCode=0
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.078177 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99g6v" event={"ID":"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a","Type":"ContainerDied","Data":"15a8c43ec004e4005838d20c30d9afc469e8e7343e7f9c567c1055f03f91a202"}
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.078215 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99g6v" event={"ID":"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a","Type":"ContainerStarted","Data":"79b6c9f7b5a395ad42bebb5f0103314bfcc3bf4b0b4b168260d0da73f4fe9d0c"}
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.082305 4725 generic.go:334] "Generic (PLEG): container finished" podID="94932c77-7581-4291-bb30-55e751a0923c" containerID="8f90bc0e02696b80ad935bdfb0994b643f22eaa198cf6ab0a2bc43b5a0e2667d" exitCode=0
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.082529 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533620-79r87" event={"ID":"94932c77-7581-4291-bb30-55e751a0923c","Type":"ContainerDied","Data":"8f90bc0e02696b80ad935bdfb0994b643f22eaa198cf6ab0a2bc43b5a0e2667d"}
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.427171 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqq6w"]
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.428191 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.430296 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.431492 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqq6w"]
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.447274 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58eda4b-360e-4504-a3be-a409e8225852-utilities\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.447306 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58eda4b-360e-4504-a3be-a409e8225852-catalog-content\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.447335 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdht\" (UniqueName: \"kubernetes.io/projected/b58eda4b-360e-4504-a3be-a409e8225852-kube-api-access-4tdht\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.548533 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58eda4b-360e-4504-a3be-a409e8225852-utilities\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.548578 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58eda4b-360e-4504-a3be-a409e8225852-catalog-content\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.548647 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdht\" (UniqueName: \"kubernetes.io/projected/b58eda4b-360e-4504-a3be-a409e8225852-kube-api-access-4tdht\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.549157 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58eda4b-360e-4504-a3be-a409e8225852-catalog-content\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.549473 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58eda4b-360e-4504-a3be-a409e8225852-utilities\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.568296 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdht\" (UniqueName: \"kubernetes.io/projected/b58eda4b-360e-4504-a3be-a409e8225852-kube-api-access-4tdht\") pod \"certified-operators-vqq6w\" (UID: \"b58eda4b-360e-4504-a3be-a409e8225852\") " pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.747903 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqq6w"
Feb 25 11:00:05 crc kubenswrapper[4725]: I0225 11:00:05.930413 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqq6w"]
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.030353 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-889vj"]
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.031507 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.033991 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.053505 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-889vj"]
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.089756 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq6w" event={"ID":"b58eda4b-360e-4504-a3be-a409e8225852","Type":"ContainerStarted","Data":"afba610feae6b91266e62c9a8f898c93b8d51338cc83f3cc41dfdedda9f15868"}
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.156227 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59wj9\" (UniqueName: \"kubernetes.io/projected/16868507-af62-4b1b-bf7c-317fe4e2c94e-kube-api-access-59wj9\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.156291 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-utilities\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.156352 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-catalog-content\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.257445 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59wj9\" (UniqueName: \"kubernetes.io/projected/16868507-af62-4b1b-bf7c-317fe4e2c94e-kube-api-access-59wj9\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.257527 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-utilities\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.257565 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-catalog-content\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.258079 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-utilities\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.258160 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-catalog-content\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.276837 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533620-79r87"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.278680 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59wj9\" (UniqueName: \"kubernetes.io/projected/16868507-af62-4b1b-bf7c-317fe4e2c94e-kube-api-access-59wj9\") pod \"community-operators-889vj\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.368486 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-889vj"
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.461622 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zc9c\" (UniqueName: \"kubernetes.io/projected/94932c77-7581-4291-bb30-55e751a0923c-kube-api-access-7zc9c\") pod \"94932c77-7581-4291-bb30-55e751a0923c\" (UID: \"94932c77-7581-4291-bb30-55e751a0923c\") "
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.465737 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94932c77-7581-4291-bb30-55e751a0923c-kube-api-access-7zc9c" (OuterVolumeSpecName: "kube-api-access-7zc9c") pod "94932c77-7581-4291-bb30-55e751a0923c" (UID: "94932c77-7581-4291-bb30-55e751a0923c"). InnerVolumeSpecName "kube-api-access-7zc9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.554252 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-889vj"]
Feb 25 11:00:06 crc kubenswrapper[4725]: W0225 11:00:06.560677 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16868507_af62_4b1b_bf7c_317fe4e2c94e.slice/crio-4c302fd06c12002c6de93388d1e1d55da1c88bd61781f430af30231396dce41c WatchSource:0}: Error finding container 4c302fd06c12002c6de93388d1e1d55da1c88bd61781f430af30231396dce41c: Status 404 returned error can't find the container with id 4c302fd06c12002c6de93388d1e1d55da1c88bd61781f430af30231396dce41c
Feb 25 11:00:06 crc kubenswrapper[4725]: I0225 11:00:06.562866 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zc9c\" (UniqueName: \"kubernetes.io/projected/94932c77-7581-4291-bb30-55e751a0923c-kube-api-access-7zc9c\") on node \"crc\" DevicePath \"\""
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.103325 4725 generic.go:334] "Generic (PLEG): container finished" podID="b58eda4b-360e-4504-a3be-a409e8225852" containerID="cb7d7067565723bb9f3b298e19209408fb19d2b074d551962f64f6bd41efcb86" exitCode=0
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.103437 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq6w" event={"ID":"b58eda4b-360e-4504-a3be-a409e8225852","Type":"ContainerDied","Data":"cb7d7067565723bb9f3b298e19209408fb19d2b074d551962f64f6bd41efcb86"}
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.106933 4725 generic.go:334] "Generic (PLEG): container finished" podID="ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a" containerID="72bea23c37edd41f027789b40e19a33ed7d71713c16efe780d3cc3c18790503a" exitCode=0
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.107669 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99g6v" event={"ID":"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a","Type":"ContainerDied","Data":"72bea23c37edd41f027789b40e19a33ed7d71713c16efe780d3cc3c18790503a"}
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.109096 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533620-79r87" event={"ID":"94932c77-7581-4291-bb30-55e751a0923c","Type":"ContainerDied","Data":"deb7e2b48cba1bd3bd5d9f428d7ad0e6919c5eed5d8646abf8aeec690908c09a"}
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.109145 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb7e2b48cba1bd3bd5d9f428d7ad0e6919c5eed5d8646abf8aeec690908c09a"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.109117 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533620-79r87"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.113388 4725 generic.go:334] "Generic (PLEG): container finished" podID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerID="e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab" exitCode=0
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.113429 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889vj" event={"ID":"16868507-af62-4b1b-bf7c-317fe4e2c94e","Type":"ContainerDied","Data":"e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab"}
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.113455 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889vj" event={"ID":"16868507-af62-4b1b-bf7c-317fe4e2c94e","Type":"ContainerStarted","Data":"4c302fd06c12002c6de93388d1e1d55da1c88bd61781f430af30231396dce41c"}
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.820728 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g7f9h"]
Feb 25 11:00:07 crc kubenswrapper[4725]: E0225 11:00:07.821286 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94932c77-7581-4291-bb30-55e751a0923c" containerName="oc"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.821300 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="94932c77-7581-4291-bb30-55e751a0923c" containerName="oc"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.821388 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="94932c77-7581-4291-bb30-55e751a0923c" containerName="oc"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.822033 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.825539 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.837279 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7f9h"]
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.978660 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29k54\" (UniqueName: \"kubernetes.io/projected/15f43ce2-181a-480f-9ea5-c608d2d414c4-kube-api-access-29k54\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.978794 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f43ce2-181a-480f-9ea5-c608d2d414c4-utilities\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:07 crc kubenswrapper[4725]: I0225 11:00:07.978885 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f43ce2-181a-480f-9ea5-c608d2d414c4-catalog-content\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.079709 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f43ce2-181a-480f-9ea5-c608d2d414c4-catalog-content\") pod \"redhat-marketplace-g7f9h\" (UID:
\"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.079853 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29k54\" (UniqueName: \"kubernetes.io/projected/15f43ce2-181a-480f-9ea5-c608d2d414c4-kube-api-access-29k54\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.079905 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f43ce2-181a-480f-9ea5-c608d2d414c4-utilities\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.080367 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f43ce2-181a-480f-9ea5-c608d2d414c4-catalog-content\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.080375 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f43ce2-181a-480f-9ea5-c608d2d414c4-utilities\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " pod="openshift-marketplace/redhat-marketplace-g7f9h" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.103681 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29k54\" (UniqueName: \"kubernetes.io/projected/15f43ce2-181a-480f-9ea5-c608d2d414c4-kube-api-access-29k54\") pod \"redhat-marketplace-g7f9h\" (UID: \"15f43ce2-181a-480f-9ea5-c608d2d414c4\") " 
pod="openshift-marketplace/redhat-marketplace-g7f9h" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.119468 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99g6v" event={"ID":"ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a","Type":"ContainerStarted","Data":"780a1ece836092bb52c83368da0a850f1c49bffa03a965db9d6845446fb0fe02"} Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.147340 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7f9h" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.347061 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-99g6v" podStartSLOduration=2.868803861 podStartE2EDuration="5.347042961s" podCreationTimestamp="2026-02-25 11:00:03 +0000 UTC" firstStartedPulling="2026-02-25 11:00:05.079681806 +0000 UTC m=+430.578263861" lastFinishedPulling="2026-02-25 11:00:07.557920936 +0000 UTC m=+433.056502961" observedRunningTime="2026-02-25 11:00:08.147684524 +0000 UTC m=+433.646266619" watchObservedRunningTime="2026-02-25 11:00:08.347042961 +0000 UTC m=+433.845624986" Feb 25 11:00:08 crc kubenswrapper[4725]: I0225 11:00:08.350040 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7f9h"] Feb 25 11:00:08 crc kubenswrapper[4725]: W0225 11:00:08.358235 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f43ce2_181a_480f_9ea5_c608d2d414c4.slice/crio-9aa1e9afcd589094151f96d1d90a8671b2d843f03cf791db1269e85393e56edc WatchSource:0}: Error finding container 9aa1e9afcd589094151f96d1d90a8671b2d843f03cf791db1269e85393e56edc: Status 404 returned error can't find the container with id 9aa1e9afcd589094151f96d1d90a8671b2d843f03cf791db1269e85393e56edc Feb 25 11:00:09 crc kubenswrapper[4725]: I0225 11:00:09.125666 4725 generic.go:334] "Generic 
(PLEG): container finished" podID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerID="063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b" exitCode=0 Feb 25 11:00:09 crc kubenswrapper[4725]: I0225 11:00:09.125728 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889vj" event={"ID":"16868507-af62-4b1b-bf7c-317fe4e2c94e","Type":"ContainerDied","Data":"063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b"} Feb 25 11:00:09 crc kubenswrapper[4725]: I0225 11:00:09.128694 4725 generic.go:334] "Generic (PLEG): container finished" podID="b58eda4b-360e-4504-a3be-a409e8225852" containerID="fa747df36153b5a5e6e3edea57997c14cfae195281c623e126708b282aec49c3" exitCode=0 Feb 25 11:00:09 crc kubenswrapper[4725]: I0225 11:00:09.128785 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq6w" event={"ID":"b58eda4b-360e-4504-a3be-a409e8225852","Type":"ContainerDied","Data":"fa747df36153b5a5e6e3edea57997c14cfae195281c623e126708b282aec49c3"} Feb 25 11:00:09 crc kubenswrapper[4725]: I0225 11:00:09.131400 4725 generic.go:334] "Generic (PLEG): container finished" podID="15f43ce2-181a-480f-9ea5-c608d2d414c4" containerID="5f0ac53a33b3d1213575cc71425d58916dbbd53976bbf6c253ff46cc6b5c3f99" exitCode=0 Feb 25 11:00:09 crc kubenswrapper[4725]: I0225 11:00:09.131481 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7f9h" event={"ID":"15f43ce2-181a-480f-9ea5-c608d2d414c4","Type":"ContainerDied","Data":"5f0ac53a33b3d1213575cc71425d58916dbbd53976bbf6c253ff46cc6b5c3f99"} Feb 25 11:00:09 crc kubenswrapper[4725]: I0225 11:00:09.131513 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7f9h" event={"ID":"15f43ce2-181a-480f-9ea5-c608d2d414c4","Type":"ContainerStarted","Data":"9aa1e9afcd589094151f96d1d90a8671b2d843f03cf791db1269e85393e56edc"} Feb 25 11:00:10 crc 
kubenswrapper[4725]: I0225 11:00:10.137998 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqq6w" event={"ID":"b58eda4b-360e-4504-a3be-a409e8225852","Type":"ContainerStarted","Data":"a7207fcf8613ac9bef3b07857946b827c86e6c9deeddcff7a3980ec89ee0647d"} Feb 25 11:00:10 crc kubenswrapper[4725]: I0225 11:00:10.139179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7f9h" event={"ID":"15f43ce2-181a-480f-9ea5-c608d2d414c4","Type":"ContainerStarted","Data":"31478d9e3a66d158550d0c69087523591295f8d8adaac9c30667206b889443d6"} Feb 25 11:00:10 crc kubenswrapper[4725]: I0225 11:00:10.141036 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889vj" event={"ID":"16868507-af62-4b1b-bf7c-317fe4e2c94e","Type":"ContainerStarted","Data":"aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841"} Feb 25 11:00:10 crc kubenswrapper[4725]: I0225 11:00:10.165130 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqq6w" podStartSLOduration=2.492229903 podStartE2EDuration="5.165106828s" podCreationTimestamp="2026-02-25 11:00:05 +0000 UTC" firstStartedPulling="2026-02-25 11:00:07.105133424 +0000 UTC m=+432.603715449" lastFinishedPulling="2026-02-25 11:00:09.778010349 +0000 UTC m=+435.276592374" observedRunningTime="2026-02-25 11:00:10.161222629 +0000 UTC m=+435.659804664" watchObservedRunningTime="2026-02-25 11:00:10.165106828 +0000 UTC m=+435.663688873" Feb 25 11:00:10 crc kubenswrapper[4725]: I0225 11:00:10.186206 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-889vj" podStartSLOduration=1.708172292 podStartE2EDuration="4.186190087s" podCreationTimestamp="2026-02-25 11:00:06 +0000 UTC" firstStartedPulling="2026-02-25 11:00:07.114559866 +0000 UTC m=+432.613141891" 
lastFinishedPulling="2026-02-25 11:00:09.592577661 +0000 UTC m=+435.091159686" observedRunningTime="2026-02-25 11:00:10.181626159 +0000 UTC m=+435.680208214" watchObservedRunningTime="2026-02-25 11:00:10.186190087 +0000 UTC m=+435.684772112" Feb 25 11:00:11 crc kubenswrapper[4725]: I0225 11:00:11.175902 4725 generic.go:334] "Generic (PLEG): container finished" podID="15f43ce2-181a-480f-9ea5-c608d2d414c4" containerID="31478d9e3a66d158550d0c69087523591295f8d8adaac9c30667206b889443d6" exitCode=0 Feb 25 11:00:11 crc kubenswrapper[4725]: I0225 11:00:11.175972 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7f9h" event={"ID":"15f43ce2-181a-480f-9ea5-c608d2d414c4","Type":"ContainerDied","Data":"31478d9e3a66d158550d0c69087523591295f8d8adaac9c30667206b889443d6"} Feb 25 11:00:11 crc kubenswrapper[4725]: I0225 11:00:11.555318 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:00:11 crc kubenswrapper[4725]: I0225 11:00:11.555384 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:00:13 crc kubenswrapper[4725]: I0225 11:00:13.948100 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-99g6v" Feb 25 11:00:13 crc kubenswrapper[4725]: I0225 11:00:13.948682 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-99g6v" Feb 25 11:00:13 crc kubenswrapper[4725]: I0225 
11:00:13.992984 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-99g6v" Feb 25 11:00:14 crc kubenswrapper[4725]: I0225 11:00:14.192299 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7f9h" event={"ID":"15f43ce2-181a-480f-9ea5-c608d2d414c4","Type":"ContainerStarted","Data":"62e0da6b59888c48cc39eea3524715497eb1378ff39eba4f4456c21a0d21c59c"} Feb 25 11:00:14 crc kubenswrapper[4725]: I0225 11:00:14.212281 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g7f9h" podStartSLOduration=4.220481327 podStartE2EDuration="7.212266878s" podCreationTimestamp="2026-02-25 11:00:07 +0000 UTC" firstStartedPulling="2026-02-25 11:00:09.132988547 +0000 UTC m=+434.631570572" lastFinishedPulling="2026-02-25 11:00:12.124774098 +0000 UTC m=+437.623356123" observedRunningTime="2026-02-25 11:00:14.211349232 +0000 UTC m=+439.709931327" watchObservedRunningTime="2026-02-25 11:00:14.212266878 +0000 UTC m=+439.710848903" Feb 25 11:00:14 crc kubenswrapper[4725]: I0225 11:00:14.240738 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-99g6v" Feb 25 11:00:15 crc kubenswrapper[4725]: I0225 11:00:15.749109 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqq6w" Feb 25 11:00:15 crc kubenswrapper[4725]: I0225 11:00:15.749165 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqq6w" Feb 25 11:00:15 crc kubenswrapper[4725]: I0225 11:00:15.787214 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqq6w" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.219313 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-7gqcf"] Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.220093 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.235289 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gqcf"] Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.274976 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqq6w" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.368613 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-889vj" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.368662 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-889vj" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.390898 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcbd\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-kube-api-access-5gcbd\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.390987 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/920f272c-7229-4b9b-b7cb-0a2836a927ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.392301 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920f272c-7229-4b9b-b7cb-0a2836a927ec-trusted-ca\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.392375 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/920f272c-7229-4b9b-b7cb-0a2836a927ec-registry-certificates\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.392426 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/920f272c-7229-4b9b-b7cb-0a2836a927ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.392532 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.392623 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-registry-tls\") pod \"image-registry-66df7c8f76-7gqcf\" 
(UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.392685 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-bound-sa-token\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.431517 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-889vj" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.436636 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.493495 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcbd\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-kube-api-access-5gcbd\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.493553 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/920f272c-7229-4b9b-b7cb-0a2836a927ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.493597 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920f272c-7229-4b9b-b7cb-0a2836a927ec-trusted-ca\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.493613 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/920f272c-7229-4b9b-b7cb-0a2836a927ec-registry-certificates\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.493626 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/920f272c-7229-4b9b-b7cb-0a2836a927ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.493687 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-registry-tls\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.493712 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-bound-sa-token\") pod 
\"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.494607 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/920f272c-7229-4b9b-b7cb-0a2836a927ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.495736 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/920f272c-7229-4b9b-b7cb-0a2836a927ec-registry-certificates\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.495742 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920f272c-7229-4b9b-b7cb-0a2836a927ec-trusted-ca\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.499618 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-registry-tls\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.501308 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/920f272c-7229-4b9b-b7cb-0a2836a927ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.512762 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-bound-sa-token\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.513449 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcbd\" (UniqueName: \"kubernetes.io/projected/920f272c-7229-4b9b-b7cb-0a2836a927ec-kube-api-access-5gcbd\") pod \"image-registry-66df7c8f76-7gqcf\" (UID: \"920f272c-7229-4b9b-b7cb-0a2836a927ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.550445 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:16 crc kubenswrapper[4725]: I0225 11:00:16.766717 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gqcf"] Feb 25 11:00:17 crc kubenswrapper[4725]: I0225 11:00:17.209421 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" event={"ID":"920f272c-7229-4b9b-b7cb-0a2836a927ec","Type":"ContainerStarted","Data":"628852aa66193e71b6490bfe8b57358951ab38c68e4db04b06733c2e890875c1"} Feb 25 11:00:17 crc kubenswrapper[4725]: I0225 11:00:17.209854 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" event={"ID":"920f272c-7229-4b9b-b7cb-0a2836a927ec","Type":"ContainerStarted","Data":"d69bf585fc1f84616bfb85cab2423684429af088aa54aa462f728a56af2ce4ea"} Feb 25 11:00:17 crc kubenswrapper[4725]: I0225 11:00:17.210222 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" Feb 25 11:00:17 crc kubenswrapper[4725]: I0225 11:00:17.265524 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-889vj" Feb 25 11:00:17 crc kubenswrapper[4725]: I0225 11:00:17.297116 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf" podStartSLOduration=1.297086486 podStartE2EDuration="1.297086486s" podCreationTimestamp="2026-02-25 11:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:00:17.241401421 +0000 UTC m=+442.739983466" watchObservedRunningTime="2026-02-25 11:00:17.297086486 +0000 UTC m=+442.795668531" Feb 25 11:00:18 crc kubenswrapper[4725]: I0225 11:00:18.148513 4725 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:18 crc kubenswrapper[4725]: I0225 11:00:18.148959 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:18 crc kubenswrapper[4725]: I0225 11:00:18.199029 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:18 crc kubenswrapper[4725]: I0225 11:00:18.267508 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g7f9h"
Feb 25 11:00:36 crc kubenswrapper[4725]: I0225 11:00:36.556893 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7gqcf"
Feb 25 11:00:36 crc kubenswrapper[4725]: I0225 11:00:36.621210 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpmr4"]
Feb 25 11:00:41 crc kubenswrapper[4725]: I0225 11:00:41.555976 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:00:41 crc kubenswrapper[4725]: I0225 11:00:41.558262 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:01:01 crc kubenswrapper[4725]: I0225 11:01:01.685371 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" podUID="f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" containerName="registry" containerID="cri-o://8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7" gracePeriod=30
Feb 25 11:01:01 crc kubenswrapper[4725]: I0225 11:01:01.985733 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.091507 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-ca-trust-extracted\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.091579 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-certificates\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.091695 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-bound-sa-token\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.091725 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfpnd\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-kube-api-access-xfpnd\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.091746 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-installation-pull-secrets\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.091976 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.092038 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-tls\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.092059 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-trusted-ca\") pod \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\" (UID: \"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519\") "
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.092439 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.092779 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.092860 4725 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.097112 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.097113 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.098390 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-kube-api-access-xfpnd" (OuterVolumeSpecName: "kube-api-access-xfpnd") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "kube-api-access-xfpnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.098577 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.103313 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.107079 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" (UID: "f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.193547 4725 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.193578 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.193588 4725 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.193598 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfpnd\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-kube-api-access-xfpnd\") on node \"crc\" DevicePath \"\""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.193608 4725 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.193616 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.456603 4725 generic.go:334] "Generic (PLEG): container finished" podID="f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" containerID="8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7" exitCode=0
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.456656 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4"
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.456651 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" event={"ID":"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519","Type":"ContainerDied","Data":"8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7"}
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.456838 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dpmr4" event={"ID":"f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519","Type":"ContainerDied","Data":"ae4fad6bd8a056d20271b3d52d84f9c0a55b466bc54caaa65f0302d7bfc3dbb9"}
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.456858 4725 scope.go:117] "RemoveContainer" containerID="8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7"
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.470910 4725 scope.go:117] "RemoveContainer" containerID="8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7"
Feb 25 11:01:02 crc kubenswrapper[4725]: E0225 11:01:02.471274 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7\": container with ID starting with 8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7 not found: ID does not exist" containerID="8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7"
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.471302 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7"} err="failed to get container status \"8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7\": rpc error: code = NotFound desc = could not find container \"8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7\": container with ID starting with 8a203f0f7b0c510c44f5196a1052fb7405cfcc4c4ff46a783f49b0dc5b892ee7 not found: ID does not exist"
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.483709 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpmr4"]
Feb 25 11:01:02 crc kubenswrapper[4725]: I0225 11:01:02.487440 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dpmr4"]
Feb 25 11:01:03 crc kubenswrapper[4725]: I0225 11:01:03.230786 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" path="/var/lib/kubelet/pods/f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519/volumes"
Feb 25 11:01:11 crc kubenswrapper[4725]: I0225 11:01:11.555507 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:01:11 crc kubenswrapper[4725]: I0225 11:01:11.556197 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:01:11 crc kubenswrapper[4725]: I0225 11:01:11.556266 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 11:01:11 crc kubenswrapper[4725]: I0225 11:01:11.557062 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"157d59ad935f84209be35b0af551db160cdd330b6d67a7efbf73190929b1d6c7"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 11:01:11 crc kubenswrapper[4725]: I0225 11:01:11.557145 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://157d59ad935f84209be35b0af551db160cdd330b6d67a7efbf73190929b1d6c7" gracePeriod=600
Feb 25 11:01:12 crc kubenswrapper[4725]: I0225 11:01:12.519964 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="157d59ad935f84209be35b0af551db160cdd330b6d67a7efbf73190929b1d6c7" exitCode=0
Feb 25 11:01:12 crc kubenswrapper[4725]: I0225 11:01:12.520043 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"157d59ad935f84209be35b0af551db160cdd330b6d67a7efbf73190929b1d6c7"}
Feb 25 11:01:12 crc kubenswrapper[4725]: I0225 11:01:12.520849 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"08647c57662156eda0794f315db9e612765b561e546985866febc0fd340a1ac9"}
Feb 25 11:01:12 crc kubenswrapper[4725]: I0225 11:01:12.520878 4725 scope.go:117] "RemoveContainer" containerID="81ff02c82e1a11e0d43cd3f0b17c7d9e42449f7a49d493deefb8ab23d2e467e2"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.130148 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533622-82gpd"]
Feb 25 11:02:00 crc kubenswrapper[4725]: E0225 11:02:00.130969 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" containerName="registry"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.130985 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" containerName="registry"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.131105 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d5eb9c-abf6-4d9c-ba1f-4a78324d5519" containerName="registry"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.131525 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533622-82gpd"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.135546 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.135887 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.136030 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.139890 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533622-82gpd"]
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.206555 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6nw2\" (UniqueName: \"kubernetes.io/projected/a880511c-5d78-4ce1-8c43-ecefd558e91c-kube-api-access-m6nw2\") pod \"auto-csr-approver-29533622-82gpd\" (UID: \"a880511c-5d78-4ce1-8c43-ecefd558e91c\") " pod="openshift-infra/auto-csr-approver-29533622-82gpd"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.307349 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6nw2\" (UniqueName: \"kubernetes.io/projected/a880511c-5d78-4ce1-8c43-ecefd558e91c-kube-api-access-m6nw2\") pod \"auto-csr-approver-29533622-82gpd\" (UID: \"a880511c-5d78-4ce1-8c43-ecefd558e91c\") " pod="openshift-infra/auto-csr-approver-29533622-82gpd"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.326897 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6nw2\" (UniqueName: \"kubernetes.io/projected/a880511c-5d78-4ce1-8c43-ecefd558e91c-kube-api-access-m6nw2\") pod \"auto-csr-approver-29533622-82gpd\" (UID: \"a880511c-5d78-4ce1-8c43-ecefd558e91c\") " pod="openshift-infra/auto-csr-approver-29533622-82gpd"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.451650 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533622-82gpd"
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.842201 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533622-82gpd"]
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.849374 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 25 11:02:00 crc kubenswrapper[4725]: I0225 11:02:00.952673 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533622-82gpd" event={"ID":"a880511c-5d78-4ce1-8c43-ecefd558e91c","Type":"ContainerStarted","Data":"7ec6a8b2af4243334b4ecd58ab46a3989bc83783f24a42e03398659669d8a3c5"}
Feb 25 11:02:01 crc kubenswrapper[4725]: I0225 11:02:01.959926 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533622-82gpd" event={"ID":"a880511c-5d78-4ce1-8c43-ecefd558e91c","Type":"ContainerStarted","Data":"62ceca1f770affcb8a1fdea855d8224741bdd2ad16b49b27ffe2b8f0983b396e"}
Feb 25 11:02:01 crc kubenswrapper[4725]: I0225 11:02:01.974519 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533622-82gpd" podStartSLOduration=1.131867626 podStartE2EDuration="1.974491204s" podCreationTimestamp="2026-02-25 11:02:00 +0000 UTC" firstStartedPulling="2026-02-25 11:02:00.849175682 +0000 UTC m=+546.347757707" lastFinishedPulling="2026-02-25 11:02:01.69179927 +0000 UTC m=+547.190381285" observedRunningTime="2026-02-25 11:02:01.972268962 +0000 UTC m=+547.470851007" watchObservedRunningTime="2026-02-25 11:02:01.974491204 +0000 UTC m=+547.473073229"
Feb 25 11:02:02 crc kubenswrapper[4725]: I0225 11:02:02.966188 4725 generic.go:334] "Generic (PLEG): container finished" podID="a880511c-5d78-4ce1-8c43-ecefd558e91c" containerID="62ceca1f770affcb8a1fdea855d8224741bdd2ad16b49b27ffe2b8f0983b396e" exitCode=0
Feb 25 11:02:02 crc kubenswrapper[4725]: I0225 11:02:02.966229 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533622-82gpd" event={"ID":"a880511c-5d78-4ce1-8c43-ecefd558e91c","Type":"ContainerDied","Data":"62ceca1f770affcb8a1fdea855d8224741bdd2ad16b49b27ffe2b8f0983b396e"}
Feb 25 11:02:04 crc kubenswrapper[4725]: I0225 11:02:04.168895 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533622-82gpd"
Feb 25 11:02:04 crc kubenswrapper[4725]: I0225 11:02:04.352014 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6nw2\" (UniqueName: \"kubernetes.io/projected/a880511c-5d78-4ce1-8c43-ecefd558e91c-kube-api-access-m6nw2\") pod \"a880511c-5d78-4ce1-8c43-ecefd558e91c\" (UID: \"a880511c-5d78-4ce1-8c43-ecefd558e91c\") "
Feb 25 11:02:04 crc kubenswrapper[4725]: I0225 11:02:04.362565 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a880511c-5d78-4ce1-8c43-ecefd558e91c-kube-api-access-m6nw2" (OuterVolumeSpecName: "kube-api-access-m6nw2") pod "a880511c-5d78-4ce1-8c43-ecefd558e91c" (UID: "a880511c-5d78-4ce1-8c43-ecefd558e91c"). InnerVolumeSpecName "kube-api-access-m6nw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:02:04 crc kubenswrapper[4725]: I0225 11:02:04.453639 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6nw2\" (UniqueName: \"kubernetes.io/projected/a880511c-5d78-4ce1-8c43-ecefd558e91c-kube-api-access-m6nw2\") on node \"crc\" DevicePath \"\""
Feb 25 11:02:04 crc kubenswrapper[4725]: I0225 11:02:04.981306 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533622-82gpd" event={"ID":"a880511c-5d78-4ce1-8c43-ecefd558e91c","Type":"ContainerDied","Data":"7ec6a8b2af4243334b4ecd58ab46a3989bc83783f24a42e03398659669d8a3c5"}
Feb 25 11:02:04 crc kubenswrapper[4725]: I0225 11:02:04.981349 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ec6a8b2af4243334b4ecd58ab46a3989bc83783f24a42e03398659669d8a3c5"
Feb 25 11:02:04 crc kubenswrapper[4725]: I0225 11:02:04.981366 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533622-82gpd"
Feb 25 11:02:05 crc kubenswrapper[4725]: I0225 11:02:05.025668 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533616-zsh9g"]
Feb 25 11:02:05 crc kubenswrapper[4725]: I0225 11:02:05.030042 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533616-zsh9g"]
Feb 25 11:02:05 crc kubenswrapper[4725]: I0225 11:02:05.234256 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b17a01-64f4-4578-9e56-19825cfa713f" path="/var/lib/kubelet/pods/b0b17a01-64f4-4578-9e56-19825cfa713f/volumes"
Feb 25 11:03:11 crc kubenswrapper[4725]: I0225 11:03:11.556177 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:03:11 crc kubenswrapper[4725]: I0225 11:03:11.557247 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:03:14 crc kubenswrapper[4725]: I0225 11:03:14.733043 4725 scope.go:117] "RemoveContainer" containerID="92893cf08b177659fe7f7f5c0824254848767f650b0b5de5b00bf6bebadc7ef4"
Feb 25 11:03:41 crc kubenswrapper[4725]: I0225 11:03:41.555678 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:03:41 crc kubenswrapper[4725]: I0225 11:03:41.556401 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.155632 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533624-r2kgf"]
Feb 25 11:04:00 crc kubenswrapper[4725]: E0225 11:04:00.156703 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a880511c-5d78-4ce1-8c43-ecefd558e91c" containerName="oc"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.156783 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a880511c-5d78-4ce1-8c43-ecefd558e91c" containerName="oc"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.157141 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a880511c-5d78-4ce1-8c43-ecefd558e91c" containerName="oc"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.158065 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533624-r2kgf"]
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.158252 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533624-r2kgf"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.200133 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.200157 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.200471 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.300652 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksbrm\" (UniqueName: \"kubernetes.io/projected/ac3f5247-3533-4360-a134-2c2d24332e5f-kube-api-access-ksbrm\") pod \"auto-csr-approver-29533624-r2kgf\" (UID: \"ac3f5247-3533-4360-a134-2c2d24332e5f\") " pod="openshift-infra/auto-csr-approver-29533624-r2kgf"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.402338 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksbrm\" (UniqueName: \"kubernetes.io/projected/ac3f5247-3533-4360-a134-2c2d24332e5f-kube-api-access-ksbrm\") pod \"auto-csr-approver-29533624-r2kgf\" (UID: \"ac3f5247-3533-4360-a134-2c2d24332e5f\") " pod="openshift-infra/auto-csr-approver-29533624-r2kgf"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.434443 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksbrm\" (UniqueName: \"kubernetes.io/projected/ac3f5247-3533-4360-a134-2c2d24332e5f-kube-api-access-ksbrm\") pod \"auto-csr-approver-29533624-r2kgf\" (UID: \"ac3f5247-3533-4360-a134-2c2d24332e5f\") " pod="openshift-infra/auto-csr-approver-29533624-r2kgf"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.516672 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533624-r2kgf"
Feb 25 11:04:00 crc kubenswrapper[4725]: I0225 11:04:00.730249 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533624-r2kgf"]
Feb 25 11:04:01 crc kubenswrapper[4725]: I0225 11:04:01.706551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533624-r2kgf" event={"ID":"ac3f5247-3533-4360-a134-2c2d24332e5f","Type":"ContainerStarted","Data":"ba1f48b5c650628ea79f1c6d14b217b6184b7c96cd0bd70e08183f7104bf45da"}
Feb 25 11:04:02 crc kubenswrapper[4725]: I0225 11:04:02.716346 4725 generic.go:334] "Generic (PLEG): container finished" podID="ac3f5247-3533-4360-a134-2c2d24332e5f" containerID="eb296869dd57a44ec8e543377f62d34b7303f1b82deb6c47732be71305eb5f20" exitCode=0
Feb 25 11:04:02 crc kubenswrapper[4725]: I0225 11:04:02.716441 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533624-r2kgf" event={"ID":"ac3f5247-3533-4360-a134-2c2d24332e5f","Type":"ContainerDied","Data":"eb296869dd57a44ec8e543377f62d34b7303f1b82deb6c47732be71305eb5f20"}
Feb 25 11:04:03 crc kubenswrapper[4725]: I0225 11:04:03.961757 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533624-r2kgf"
Feb 25 11:04:04 crc kubenswrapper[4725]: I0225 11:04:04.148477 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksbrm\" (UniqueName: \"kubernetes.io/projected/ac3f5247-3533-4360-a134-2c2d24332e5f-kube-api-access-ksbrm\") pod \"ac3f5247-3533-4360-a134-2c2d24332e5f\" (UID: \"ac3f5247-3533-4360-a134-2c2d24332e5f\") "
Feb 25 11:04:04 crc kubenswrapper[4725]: I0225 11:04:04.153570 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3f5247-3533-4360-a134-2c2d24332e5f-kube-api-access-ksbrm" (OuterVolumeSpecName: "kube-api-access-ksbrm") pod "ac3f5247-3533-4360-a134-2c2d24332e5f" (UID: "ac3f5247-3533-4360-a134-2c2d24332e5f"). InnerVolumeSpecName "kube-api-access-ksbrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:04:04 crc kubenswrapper[4725]: I0225 11:04:04.249885 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksbrm\" (UniqueName: \"kubernetes.io/projected/ac3f5247-3533-4360-a134-2c2d24332e5f-kube-api-access-ksbrm\") on node \"crc\" DevicePath \"\""
Feb 25 11:04:04 crc kubenswrapper[4725]: I0225 11:04:04.741439 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533624-r2kgf" event={"ID":"ac3f5247-3533-4360-a134-2c2d24332e5f","Type":"ContainerDied","Data":"ba1f48b5c650628ea79f1c6d14b217b6184b7c96cd0bd70e08183f7104bf45da"}
Feb 25 11:04:04 crc kubenswrapper[4725]: I0225 11:04:04.741495 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1f48b5c650628ea79f1c6d14b217b6184b7c96cd0bd70e08183f7104bf45da"
Feb 25 11:04:04 crc kubenswrapper[4725]: I0225 11:04:04.741581 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533624-r2kgf"
Feb 25 11:04:05 crc kubenswrapper[4725]: I0225 11:04:05.036798 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533618-tbgdb"]
Feb 25 11:04:05 crc kubenswrapper[4725]: I0225 11:04:05.043240 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533618-tbgdb"]
Feb 25 11:04:05 crc kubenswrapper[4725]: I0225 11:04:05.230709 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c413df5-7174-492a-8ab4-314e9be6bf83" path="/var/lib/kubelet/pods/1c413df5-7174-492a-8ab4-314e9be6bf83/volumes"
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.555623 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.557058 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.557167 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.558714 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08647c57662156eda0794f315db9e612765b561e546985866febc0fd340a1ac9"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.558811 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://08647c57662156eda0794f315db9e612765b561e546985866febc0fd340a1ac9" gracePeriod=600
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.808007 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="08647c57662156eda0794f315db9e612765b561e546985866febc0fd340a1ac9" exitCode=0
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.808070 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"08647c57662156eda0794f315db9e612765b561e546985866febc0fd340a1ac9"}
Feb 25 11:04:11 crc kubenswrapper[4725]: I0225 11:04:11.808773 4725 scope.go:117] "RemoveContainer" containerID="157d59ad935f84209be35b0af551db160cdd330b6d67a7efbf73190929b1d6c7"
Feb 25 11:04:12 crc kubenswrapper[4725]: I0225 11:04:12.819676 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"976e63b74d2c07989af044494938e1fa71027bc94145eac91a1d7ca390924f15"}
Feb 25 11:04:14 crc kubenswrapper[4725]: I0225 11:04:14.770487 4725 scope.go:117] "RemoveContainer" containerID="0891e915e9feff026f54347fefc15acd2e37f71aa7cc2b70deef1316d7f8786b"
Feb 25 11:04:14 crc kubenswrapper[4725]: I0225 11:04:14.804494 4725 scope.go:117] "RemoveContainer" containerID="8e88f37ae88e7ecaf1325e36a47cb9c7db6b1b4d75901f1ba04e427c12f0b843"
Feb 25 11:04:14 crc kubenswrapper[4725]: I0225 11:04:14.823781 4725 scope.go:117] "RemoveContainer" containerID="7fa51daade8fd55cabfe4a6034a853e9418f682d31aedcf824c57c0d43a1972e"
Feb 25 11:04:14 crc kubenswrapper[4725]: I0225 11:04:14.846706 4725 scope.go:117] "RemoveContainer" containerID="c3291a321087990616222f8b5a2cd186793e4e37830a806da78e2f830bac57b7"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.306679 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb"]
Feb 25 11:05:04 crc kubenswrapper[4725]: E0225 11:05:04.307646 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3f5247-3533-4360-a134-2c2d24332e5f" containerName="oc"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.307668 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3f5247-3533-4360-a134-2c2d24332e5f" containerName="oc"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.308030 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3f5247-3533-4360-a134-2c2d24332e5f" containerName="oc"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.308584 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.311511 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.311980 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.315474 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-k9jtf"]
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.316195 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-k9jtf"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.316779 4725 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-gd7wj"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.320385 4725 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qfxfd"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.324664 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6wnjw"]
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.325744 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.328413 4725 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-k49vz"
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.328874 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb"]
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.334922 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6wnjw"]
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.348126 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-k9jtf"]
Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.381471 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4kx\" (UniqueName: \"kubernetes.io/projected/899495ee-adcf-4350-a1b3-6a3cdd8c9d42-kube-api-access-vt4kx\") pod \"cert-manager-webhook-687f57d79b-6wnjw\" (UID: \"899495ee-adcf-4350-a1b3-6a3cdd8c9d42\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw"
Feb 25 11:05:04 crc kubenswrapper[4725]:
I0225 11:05:04.381544 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbldf\" (UniqueName: \"kubernetes.io/projected/32a638a3-425e-4564-b5e1-b11c3d332ed6-kube-api-access-jbldf\") pod \"cert-manager-858654f9db-k9jtf\" (UID: \"32a638a3-425e-4564-b5e1-b11c3d332ed6\") " pod="cert-manager/cert-manager-858654f9db-k9jtf" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.381596 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6wj\" (UniqueName: \"kubernetes.io/projected/5f1f7118-2524-4653-9a60-82142d16ef44-kube-api-access-sf6wj\") pod \"cert-manager-cainjector-cf98fcc89-6xsxb\" (UID: \"5f1f7118-2524-4653-9a60-82142d16ef44\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.482721 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4kx\" (UniqueName: \"kubernetes.io/projected/899495ee-adcf-4350-a1b3-6a3cdd8c9d42-kube-api-access-vt4kx\") pod \"cert-manager-webhook-687f57d79b-6wnjw\" (UID: \"899495ee-adcf-4350-a1b3-6a3cdd8c9d42\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.482894 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbldf\" (UniqueName: \"kubernetes.io/projected/32a638a3-425e-4564-b5e1-b11c3d332ed6-kube-api-access-jbldf\") pod \"cert-manager-858654f9db-k9jtf\" (UID: \"32a638a3-425e-4564-b5e1-b11c3d332ed6\") " pod="cert-manager/cert-manager-858654f9db-k9jtf" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.483012 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6wj\" (UniqueName: \"kubernetes.io/projected/5f1f7118-2524-4653-9a60-82142d16ef44-kube-api-access-sf6wj\") pod \"cert-manager-cainjector-cf98fcc89-6xsxb\" (UID: 
\"5f1f7118-2524-4653-9a60-82142d16ef44\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.502923 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbldf\" (UniqueName: \"kubernetes.io/projected/32a638a3-425e-4564-b5e1-b11c3d332ed6-kube-api-access-jbldf\") pod \"cert-manager-858654f9db-k9jtf\" (UID: \"32a638a3-425e-4564-b5e1-b11c3d332ed6\") " pod="cert-manager/cert-manager-858654f9db-k9jtf" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.503936 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6wj\" (UniqueName: \"kubernetes.io/projected/5f1f7118-2524-4653-9a60-82142d16ef44-kube-api-access-sf6wj\") pod \"cert-manager-cainjector-cf98fcc89-6xsxb\" (UID: \"5f1f7118-2524-4653-9a60-82142d16ef44\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.509999 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4kx\" (UniqueName: \"kubernetes.io/projected/899495ee-adcf-4350-a1b3-6a3cdd8c9d42-kube-api-access-vt4kx\") pod \"cert-manager-webhook-687f57d79b-6wnjw\" (UID: \"899495ee-adcf-4350-a1b3-6a3cdd8c9d42\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.638974 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.660999 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-k9jtf" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.670498 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" Feb 25 11:05:04 crc kubenswrapper[4725]: I0225 11:05:04.904163 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb"] Feb 25 11:05:05 crc kubenswrapper[4725]: I0225 11:05:05.162091 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6wnjw"] Feb 25 11:05:05 crc kubenswrapper[4725]: W0225 11:05:05.166371 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod899495ee_adcf_4350_a1b3_6a3cdd8c9d42.slice/crio-adf40684fe55c4c92212fd338104d1c4fddc2e8d8074838b6e61d37ff553f578 WatchSource:0}: Error finding container adf40684fe55c4c92212fd338104d1c4fddc2e8d8074838b6e61d37ff553f578: Status 404 returned error can't find the container with id adf40684fe55c4c92212fd338104d1c4fddc2e8d8074838b6e61d37ff553f578 Feb 25 11:05:05 crc kubenswrapper[4725]: I0225 11:05:05.170900 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb" event={"ID":"5f1f7118-2524-4653-9a60-82142d16ef44","Type":"ContainerStarted","Data":"0d324064d6ef1b12270d2ceea7c7f4359ea723da301261d810ac77d37d5b34d7"} Feb 25 11:05:05 crc kubenswrapper[4725]: I0225 11:05:05.185260 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-k9jtf"] Feb 25 11:05:05 crc kubenswrapper[4725]: W0225 11:05:05.185513 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a638a3_425e_4564_b5e1_b11c3d332ed6.slice/crio-00dca3cbe08b894c83ce86961822edc6f1d093979da3f09cd384be83014272cf WatchSource:0}: Error finding container 00dca3cbe08b894c83ce86961822edc6f1d093979da3f09cd384be83014272cf: Status 404 returned error can't find the container with id 
00dca3cbe08b894c83ce86961822edc6f1d093979da3f09cd384be83014272cf Feb 25 11:05:06 crc kubenswrapper[4725]: I0225 11:05:06.178206 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-k9jtf" event={"ID":"32a638a3-425e-4564-b5e1-b11c3d332ed6","Type":"ContainerStarted","Data":"00dca3cbe08b894c83ce86961822edc6f1d093979da3f09cd384be83014272cf"} Feb 25 11:05:06 crc kubenswrapper[4725]: I0225 11:05:06.180331 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" event={"ID":"899495ee-adcf-4350-a1b3-6a3cdd8c9d42","Type":"ContainerStarted","Data":"adf40684fe55c4c92212fd338104d1c4fddc2e8d8074838b6e61d37ff553f578"} Feb 25 11:05:07 crc kubenswrapper[4725]: I0225 11:05:07.187544 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb" event={"ID":"5f1f7118-2524-4653-9a60-82142d16ef44","Type":"ContainerStarted","Data":"4676a858659dec7c970bf51886446178e64d9b1da4e4e5ca25cab79ab8e355c7"} Feb 25 11:05:07 crc kubenswrapper[4725]: I0225 11:05:07.206152 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6xsxb" podStartSLOduration=1.148688568 podStartE2EDuration="3.206136494s" podCreationTimestamp="2026-02-25 11:05:04 +0000 UTC" firstStartedPulling="2026-02-25 11:05:04.916489697 +0000 UTC m=+730.415071722" lastFinishedPulling="2026-02-25 11:05:06.973937603 +0000 UTC m=+732.472519648" observedRunningTime="2026-02-25 11:05:07.201518977 +0000 UTC m=+732.700101002" watchObservedRunningTime="2026-02-25 11:05:07.206136494 +0000 UTC m=+732.704718509" Feb 25 11:05:09 crc kubenswrapper[4725]: I0225 11:05:09.206494 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-k9jtf" event={"ID":"32a638a3-425e-4564-b5e1-b11c3d332ed6","Type":"ContainerStarted","Data":"84dd5cc8f846c09df489f7565df805cc01ec8bc659a2ef07a8c49bb0063ff381"} Feb 
25 11:05:09 crc kubenswrapper[4725]: I0225 11:05:09.210634 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" event={"ID":"899495ee-adcf-4350-a1b3-6a3cdd8c9d42","Type":"ContainerStarted","Data":"393055cf8508c52047036759ee5f47041937889e7e4fc8f7ca18709c14e0e0c9"} Feb 25 11:05:09 crc kubenswrapper[4725]: I0225 11:05:09.211663 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" Feb 25 11:05:09 crc kubenswrapper[4725]: I0225 11:05:09.222292 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-k9jtf" podStartSLOduration=1.701337118 podStartE2EDuration="5.222273287s" podCreationTimestamp="2026-02-25 11:05:04 +0000 UTC" firstStartedPulling="2026-02-25 11:05:05.18856218 +0000 UTC m=+730.687144245" lastFinishedPulling="2026-02-25 11:05:08.709498389 +0000 UTC m=+734.208080414" observedRunningTime="2026-02-25 11:05:09.220045596 +0000 UTC m=+734.718627661" watchObservedRunningTime="2026-02-25 11:05:09.222273287 +0000 UTC m=+734.720855342" Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.875504 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" podStartSLOduration=5.276185354 podStartE2EDuration="8.875473829s" podCreationTimestamp="2026-02-25 11:05:04 +0000 UTC" firstStartedPulling="2026-02-25 11:05:05.169028175 +0000 UTC m=+730.667610240" lastFinishedPulling="2026-02-25 11:05:08.76831665 +0000 UTC m=+734.266898715" observedRunningTime="2026-02-25 11:05:09.239938021 +0000 UTC m=+734.738520106" watchObservedRunningTime="2026-02-25 11:05:12.875473829 +0000 UTC m=+738.374055884" Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.880687 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6klc9"] Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 
11:05:12.881748 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-controller" containerID="cri-o://59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" gracePeriod=30 Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.881808 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="nbdb" containerID="cri-o://3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" gracePeriod=30 Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.881926 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="northd" containerID="cri-o://c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" gracePeriod=30 Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.881883 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" gracePeriod=30 Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.882060 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-acl-logging" containerID="cri-o://c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" gracePeriod=30 Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.882074 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" 
podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="sbdb" containerID="cri-o://851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" gracePeriod=30 Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.882170 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-node" containerID="cri-o://87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" gracePeriod=30 Feb 25 11:05:12 crc kubenswrapper[4725]: I0225 11:05:12.945036 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" containerID="cri-o://199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" gracePeriod=30 Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.059355 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-conmon-4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-conmon-87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-conmon-199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a39624_e0d8_44dc_9596_cd7224f58d5d.slice/crio-conmon-c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df.scope\": RecentStats: unable to find data in memory cache]" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.250137 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/3.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.251084 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovnkube-controller/3.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.254079 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovn-acl-logging/0.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.254806 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovn-controller/0.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.255095 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovn-acl-logging/0.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 
11:05:13.255575 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6klc9_07a39624-e0d8-44dc-9596-cd7224f58d5d/ovn-controller/0.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.255584 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256083 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" exitCode=0 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256123 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" exitCode=0 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256144 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" exitCode=0 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256150 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256164 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" exitCode=0 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256181 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" exitCode=0 Feb 25 11:05:13 crc 
kubenswrapper[4725]: I0225 11:05:13.256197 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256213 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256225 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256236 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256249 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256262 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256273 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256279 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256285 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256291 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256297 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256303 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256309 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256198 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" exitCode=0 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256347 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" exitCode=143 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256370 4725 generic.go:334] "Generic (PLEG): container finished" podID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerID="59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" exitCode=143 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256315 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256410 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256299 4725 scope.go:117] "RemoveContainer" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256446 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256464 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256478 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256490 4725 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256502 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256514 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256525 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256537 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256548 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256560 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256577 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 
11:05:13.256595 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256608 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256619 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256631 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256643 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256655 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256667 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256678 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 
11:05:13.256689 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256810 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256857 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" event={"ID":"07a39624-e0d8-44dc-9596-cd7224f58d5d","Type":"ContainerDied","Data":"6e754f79b582e88daaa8265d5628448ee5846cd084b944e9b061e538e4054258"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256877 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256892 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256907 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256919 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256932 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256944 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256958 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256969 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256980 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.256991 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.261170 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/2.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.261684 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/1.log" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.261799 4725 generic.go:334] "Generic (PLEG): container finished" podID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" 
containerID="e36f678444c7f8932a1272a93dd2c22ee7a9de5680524aba427e492321e3c745" exitCode=2 Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.261848 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerDied","Data":"e36f678444c7f8932a1272a93dd2c22ee7a9de5680524aba427e492321e3c745"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.262022 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3"} Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.262471 4725 scope.go:117] "RemoveContainer" containerID="e36f678444c7f8932a1272a93dd2c22ee7a9de5680524aba427e492321e3c745" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.265313 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d6b9f_openshift-multus(7fb276f6-5e43-4b04-a290-42bfdc3b1125)\"" pod="openshift-multus/multus-d6b9f" podUID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.278055 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.301280 4725 scope.go:117] "RemoveContainer" containerID="851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314203 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b4mb9"] Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314494 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kubecfg-setup" Feb 25 11:05:13 crc kubenswrapper[4725]: 
I0225 11:05:13.314518 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kubecfg-setup" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314538 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314545 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314556 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314563 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314571 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314578 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314587 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-node" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314593 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-node" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314601 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="nbdb" Feb 25 11:05:13 crc 
kubenswrapper[4725]: I0225 11:05:13.314606 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="nbdb" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314615 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="sbdb" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314620 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="sbdb" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314631 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-acl-logging" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314647 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-acl-logging" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314656 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314662 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314668 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314675 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314685 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 
11:05:13.314692 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.314713 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="northd" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314720 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="northd" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314875 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="nbdb" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.314893 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315756 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315770 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-acl-logging" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315779 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovn-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315787 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315797 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-node" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 
11:05:13.315847 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="sbdb" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315856 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315866 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.315894 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="northd" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.315994 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.316002 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.316092 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" containerName="ovnkube-controller" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.317559 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.329363 4725 scope.go:117] "RemoveContainer" containerID="3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.344810 4725 scope.go:117] "RemoveContainer" containerID="c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.362891 4725 scope.go:117] "RemoveContainer" containerID="4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.375413 4725 scope.go:117] "RemoveContainer" containerID="87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.389444 4725 scope.go:117] "RemoveContainer" containerID="c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.404722 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-ovn\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.404953 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-kubelet\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405079 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-netd\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: 
\"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405228 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-node-log\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.404867 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405009 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405156 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405391 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-node-log" (OuterVolumeSpecName: "node-log") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405570 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-config\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405695 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-openvswitch\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405777 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.405944 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovn-node-metrics-cert\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.406073 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-ovn-kubernetes\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.406326 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-etc-openvswitch\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.406118 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.406155 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.406479 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.406729 4725 scope.go:117] "RemoveContainer" containerID="59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.406935 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hct4s\" (UniqueName: \"kubernetes.io/projected/07a39624-e0d8-44dc-9596-cd7224f58d5d-kube-api-access-hct4s\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.407707 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.407945 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-script-lib\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.407781 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408194 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408072 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-var-lib-openvswitch\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408426 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-systemd\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408319 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408547 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-log-socket\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408813 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-slash\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408949 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-netns\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409058 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-bin\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409165 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-env-overrides\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408713 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-log-socket" (OuterVolumeSpecName: "log-socket") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.408872 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-slash" (OuterVolumeSpecName: "host-slash") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409280 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409247 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409397 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409463 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-systemd-units\") pod \"07a39624-e0d8-44dc-9596-cd7224f58d5d\" (UID: \"07a39624-e0d8-44dc-9596-cd7224f58d5d\") " Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409789 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409955 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-ovnkube-script-lib\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.410068 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-etc-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.410206 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-systemd-units\") pod \"ovnkube-node-b4mb9\" (UID: 
\"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.410339 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-slash\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.410475 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-env-overrides\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.409855 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.410601 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-ovn\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.410928 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411086 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-node-log\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411240 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-log-socket\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411336 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qdf\" (UniqueName: \"kubernetes.io/projected/f1a696ff-eff0-45b5-886a-e721a73a8037-kube-api-access-48qdf\") pod \"ovnkube-node-b4mb9\" (UID: 
\"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411476 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-run-netns\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411685 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-kubelet\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411752 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-systemd\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411786 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-cni-netd\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411816 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1a696ff-eff0-45b5-886a-e721a73a8037-ovn-node-metrics-cert\") pod 
\"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411886 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-cni-bin\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.411923 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-ovnkube-config\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412023 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412062 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-var-lib-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412127 4725 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-env-overrides\") on node 
\"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412148 4725 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412166 4725 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412182 4725 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412198 4725 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412215 4725 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-node-log\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412231 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412246 4725 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412265 4725 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412284 4725 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412301 4725 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412319 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412337 4725 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412351 4725 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-log-socket\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412366 4725 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-slash\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412380 4725 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.412395 4725 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.413695 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a39624-e0d8-44dc-9596-cd7224f58d5d-kube-api-access-hct4s" (OuterVolumeSpecName: "kube-api-access-hct4s") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "kube-api-access-hct4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.414168 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.420981 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "07a39624-e0d8-44dc-9596-cd7224f58d5d" (UID: "07a39624-e0d8-44dc-9596-cd7224f58d5d"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.423641 4725 scope.go:117] "RemoveContainer" containerID="70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.441522 4725 scope.go:117] "RemoveContainer" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.442052 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": container with ID starting with 199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860 not found: ID does not exist" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.442125 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} err="failed to get container status \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": rpc error: code = NotFound desc = could not find container \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": container with ID starting with 199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.442169 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.442796 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": container with ID starting with 
dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14 not found: ID does not exist" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.442883 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} err="failed to get container status \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": rpc error: code = NotFound desc = could not find container \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": container with ID starting with dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.442915 4725 scope.go:117] "RemoveContainer" containerID="851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.444028 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": container with ID starting with 851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac not found: ID does not exist" containerID="851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.444067 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} err="failed to get container status \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": rpc error: code = NotFound desc = could not find container \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": container with ID starting with 851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac not found: ID does not 
exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.444090 4725 scope.go:117] "RemoveContainer" containerID="3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.444449 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": container with ID starting with 3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd not found: ID does not exist" containerID="3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.444497 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} err="failed to get container status \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": rpc error: code = NotFound desc = could not find container \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": container with ID starting with 3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.444532 4725 scope.go:117] "RemoveContainer" containerID="c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.444861 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": container with ID starting with c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc not found: ID does not exist" containerID="c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.444900 4725 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} err="failed to get container status \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": rpc error: code = NotFound desc = could not find container \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": container with ID starting with c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.444922 4725 scope.go:117] "RemoveContainer" containerID="4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.446094 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": container with ID starting with 4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e not found: ID does not exist" containerID="4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.446141 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} err="failed to get container status \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": rpc error: code = NotFound desc = could not find container \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": container with ID starting with 4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.446169 4725 scope.go:117] "RemoveContainer" containerID="87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.446537 4725 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": container with ID starting with 87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5 not found: ID does not exist" containerID="87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.446574 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} err="failed to get container status \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": rpc error: code = NotFound desc = could not find container \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": container with ID starting with 87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.446593 4725 scope.go:117] "RemoveContainer" containerID="c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.446791 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": container with ID starting with c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df not found: ID does not exist" containerID="c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.446812 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} err="failed to get container status \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": rpc error: code = NotFound desc = could 
not find container \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": container with ID starting with c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.446846 4725 scope.go:117] "RemoveContainer" containerID="59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.447339 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": container with ID starting with 59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f not found: ID does not exist" containerID="59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.447363 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} err="failed to get container status \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": rpc error: code = NotFound desc = could not find container \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": container with ID starting with 59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.447381 4725 scope.go:117] "RemoveContainer" containerID="70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b" Feb 25 11:05:13 crc kubenswrapper[4725]: E0225 11:05:13.447875 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": container with ID starting with 70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b not found: 
ID does not exist" containerID="70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.447910 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} err="failed to get container status \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": rpc error: code = NotFound desc = could not find container \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": container with ID starting with 70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.447930 4725 scope.go:117] "RemoveContainer" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.448178 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} err="failed to get container status \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": rpc error: code = NotFound desc = could not find container \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": container with ID starting with 199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.448202 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.448429 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} err="failed to get container status \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": rpc error: code = 
NotFound desc = could not find container \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": container with ID starting with dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.448530 4725 scope.go:117] "RemoveContainer" containerID="851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.448897 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} err="failed to get container status \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": rpc error: code = NotFound desc = could not find container \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": container with ID starting with 851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.448924 4725 scope.go:117] "RemoveContainer" containerID="3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.449169 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} err="failed to get container status \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": rpc error: code = NotFound desc = could not find container \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": container with ID starting with 3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.449262 4725 scope.go:117] "RemoveContainer" containerID="c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" Feb 25 11:05:13 crc 
kubenswrapper[4725]: I0225 11:05:13.449544 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} err="failed to get container status \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": rpc error: code = NotFound desc = could not find container \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": container with ID starting with c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.449570 4725 scope.go:117] "RemoveContainer" containerID="4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.449807 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} err="failed to get container status \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": rpc error: code = NotFound desc = could not find container \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": container with ID starting with 4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.449843 4725 scope.go:117] "RemoveContainer" containerID="87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.450053 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} err="failed to get container status \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": rpc error: code = NotFound desc = could not find container \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": container 
with ID starting with 87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.450141 4725 scope.go:117] "RemoveContainer" containerID="c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.450413 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} err="failed to get container status \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": rpc error: code = NotFound desc = could not find container \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": container with ID starting with c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.450433 4725 scope.go:117] "RemoveContainer" containerID="59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.450647 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} err="failed to get container status \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": rpc error: code = NotFound desc = could not find container \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": container with ID starting with 59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.450667 4725 scope.go:117] "RemoveContainer" containerID="70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.450936 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} err="failed to get container status \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": rpc error: code = NotFound desc = could not find container \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": container with ID starting with 70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451031 4725 scope.go:117] "RemoveContainer" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451327 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} err="failed to get container status \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": rpc error: code = NotFound desc = could not find container \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": container with ID starting with 199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451356 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451540 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} err="failed to get container status \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": rpc error: code = NotFound desc = could not find container \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": container with ID starting with dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14 not found: ID does not 
exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451560 4725 scope.go:117] "RemoveContainer" containerID="851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451757 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} err="failed to get container status \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": rpc error: code = NotFound desc = could not find container \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": container with ID starting with 851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451781 4725 scope.go:117] "RemoveContainer" containerID="3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451952 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} err="failed to get container status \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": rpc error: code = NotFound desc = could not find container \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": container with ID starting with 3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.451978 4725 scope.go:117] "RemoveContainer" containerID="c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.452185 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} err="failed to get container status 
\"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": rpc error: code = NotFound desc = could not find container \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": container with ID starting with c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.452276 4725 scope.go:117] "RemoveContainer" containerID="4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.452603 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} err="failed to get container status \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": rpc error: code = NotFound desc = could not find container \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": container with ID starting with 4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.452633 4725 scope.go:117] "RemoveContainer" containerID="87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.452819 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} err="failed to get container status \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": rpc error: code = NotFound desc = could not find container \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": container with ID starting with 87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.452855 4725 scope.go:117] "RemoveContainer" 
containerID="c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.453051 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} err="failed to get container status \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": rpc error: code = NotFound desc = could not find container \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": container with ID starting with c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.453140 4725 scope.go:117] "RemoveContainer" containerID="59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.453437 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} err="failed to get container status \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": rpc error: code = NotFound desc = could not find container \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": container with ID starting with 59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.453459 4725 scope.go:117] "RemoveContainer" containerID="70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.453692 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} err="failed to get container status \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": rpc error: code = NotFound desc = could 
not find container \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": container with ID starting with 70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.453737 4725 scope.go:117] "RemoveContainer" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.453985 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} err="failed to get container status \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": rpc error: code = NotFound desc = could not find container \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": container with ID starting with 199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.454003 4725 scope.go:117] "RemoveContainer" containerID="dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.454362 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14"} err="failed to get container status \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": rpc error: code = NotFound desc = could not find container \"dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14\": container with ID starting with dd575379ed8ce0e9ea1f05a2ba294d9d8a0b34d5754860fdcaaba235221add14 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.454397 4725 scope.go:117] "RemoveContainer" containerID="851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 
11:05:13.454748 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac"} err="failed to get container status \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": rpc error: code = NotFound desc = could not find container \"851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac\": container with ID starting with 851856b0da3ee2a7e1db4401a159ee60c8bf4ee68a3c0c2013413cd541d8f4ac not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.454897 4725 scope.go:117] "RemoveContainer" containerID="3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455257 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd"} err="failed to get container status \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": rpc error: code = NotFound desc = could not find container \"3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd\": container with ID starting with 3e4fc0075ca762d6a0df3c873610a3d0d2597ce89e844aac57c7d3e475d124dd not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455291 4725 scope.go:117] "RemoveContainer" containerID="c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455514 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc"} err="failed to get container status \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": rpc error: code = NotFound desc = could not find container \"c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc\": container with ID starting with 
c94f30dff0cb42aaff2a5c5ea588df0ae8f1ec4837ecd04187910727103d01bc not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455535 4725 scope.go:117] "RemoveContainer" containerID="4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455694 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e"} err="failed to get container status \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": rpc error: code = NotFound desc = could not find container \"4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e\": container with ID starting with 4465110a798b5060f23d205e80b68087b2a8dcfe3e1813b39c2430ef6a89b03e not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455714 4725 scope.go:117] "RemoveContainer" containerID="87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455955 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5"} err="failed to get container status \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": rpc error: code = NotFound desc = could not find container \"87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5\": container with ID starting with 87c59cc34599fa8573ff9dd4d05981146ec8657864efa4dc7d289c33704b11c5 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.455989 4725 scope.go:117] "RemoveContainer" containerID="c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.456256 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df"} err="failed to get container status \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": rpc error: code = NotFound desc = could not find container \"c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df\": container with ID starting with c891612c073d68352900178d8ecf04a9472c3ca1c41b0843572845020f7244df not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.456277 4725 scope.go:117] "RemoveContainer" containerID="59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.456496 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f"} err="failed to get container status \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": rpc error: code = NotFound desc = could not find container \"59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f\": container with ID starting with 59b5833fa539982b866f6f70dd0ee397f5d3b23a1c5a7ad1165f6527c4c4672f not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.456532 4725 scope.go:117] "RemoveContainer" containerID="70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.456892 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b"} err="failed to get container status \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": rpc error: code = NotFound desc = could not find container \"70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b\": container with ID starting with 70b247f6c25aa1844400df1ade575715cc659c1d0697bdeca4f4dc5e8e15d81b not found: ID does not 
exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.456988 4725 scope.go:117] "RemoveContainer" containerID="199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.457483 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860"} err="failed to get container status \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": rpc error: code = NotFound desc = could not find container \"199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860\": container with ID starting with 199967102f62641c6b0d7d7d1c3f9677775e4042c54338f17b16c06c23926860 not found: ID does not exist" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.513753 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.514183 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-var-lib-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.514571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.514776 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-ovnkube-script-lib\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.515007 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-etc-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.514651 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.515368 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-systemd-units\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.513941 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.515104 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-etc-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.514317 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-var-lib-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.515562 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-ovnkube-script-lib\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.515745 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-systemd-units\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.515970 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-slash\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: 
I0225 11:05:13.516124 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-env-overrides\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516268 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-ovn\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516401 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516356 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-ovn\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-openvswitch\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516067 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-slash\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516472 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-env-overrides\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516691 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-node-log\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.516575 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-node-log\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.517465 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-log-socket\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.517726 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qdf\" (UniqueName: \"kubernetes.io/projected/f1a696ff-eff0-45b5-886a-e721a73a8037-kube-api-access-48qdf\") pod 
\"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.517662 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-log-socket\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.518587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-run-netns\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.518782 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-run-netns\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519159 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-kubelet\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519340 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-systemd\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-cni-netd\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519628 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1a696ff-eff0-45b5-886a-e721a73a8037-ovn-node-metrics-cert\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519791 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-cni-bin\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519974 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-ovnkube-config\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519400 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-run-systemd\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc 
kubenswrapper[4725]: I0225 11:05:13.519258 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-kubelet\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519880 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-cni-bin\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.519552 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a696ff-eff0-45b5-886a-e721a73a8037-host-cni-netd\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.520408 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a696ff-eff0-45b5-886a-e721a73a8037-ovnkube-config\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.520630 4725 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a39624-e0d8-44dc-9596-cd7224f58d5d-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.520651 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a39624-e0d8-44dc-9596-cd7224f58d5d-ovn-node-metrics-cert\") on node \"crc\" 
DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.520663 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hct4s\" (UniqueName: \"kubernetes.io/projected/07a39624-e0d8-44dc-9596-cd7224f58d5d-kube-api-access-hct4s\") on node \"crc\" DevicePath \"\"" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.524867 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1a696ff-eff0-45b5-886a-e721a73a8037-ovn-node-metrics-cert\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.548000 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qdf\" (UniqueName: \"kubernetes.io/projected/f1a696ff-eff0-45b5-886a-e721a73a8037-kube-api-access-48qdf\") pod \"ovnkube-node-b4mb9\" (UID: \"f1a696ff-eff0-45b5-886a-e721a73a8037\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: I0225 11:05:13.643695 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:13 crc kubenswrapper[4725]: W0225 11:05:13.673142 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a696ff_eff0_45b5_886a_e721a73a8037.slice/crio-7f40aec321eba1c6c9ff8b8476e6c8cbc7aba14605c15d4799522e6b23873a37 WatchSource:0}: Error finding container 7f40aec321eba1c6c9ff8b8476e6c8cbc7aba14605c15d4799522e6b23873a37: Status 404 returned error can't find the container with id 7f40aec321eba1c6c9ff8b8476e6c8cbc7aba14605c15d4799522e6b23873a37 Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.270473 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6klc9" Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.272038 4725 generic.go:334] "Generic (PLEG): container finished" podID="f1a696ff-eff0-45b5-886a-e721a73a8037" containerID="8c07b3cc0558d81c963641074a80188089a13b90c7a454d294c73696ed830b50" exitCode=0 Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.272168 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerDied","Data":"8c07b3cc0558d81c963641074a80188089a13b90c7a454d294c73696ed830b50"} Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.272488 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"7f40aec321eba1c6c9ff8b8476e6c8cbc7aba14605c15d4799522e6b23873a37"} Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.332745 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6klc9"] Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.348261 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6klc9"] Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.675320 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-6wnjw" Feb 25 11:05:14 crc kubenswrapper[4725]: I0225 11:05:14.931394 4725 scope.go:117] "RemoveContainer" containerID="450f667a90ee81126322c6369c4c923f659d0169304a9297898be1efc1baaea3" Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.239001 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a39624-e0d8-44dc-9596-cd7224f58d5d" path="/var/lib/kubelet/pods/07a39624-e0d8-44dc-9596-cd7224f58d5d/volumes" Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.282374 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"eeedc1e654f6df688fbb4bea5c4eab0ef1d3d579c5be9a1d15b02302cbd63693"} Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.282431 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"430efb70bc7a293b8bf179fe722af1caef21e1ed36360f4aec77b764e02fd66f"} Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.282457 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"b8535be223d857e89be5ec6282f010618c336948bfd2490a09268d66794bb1de"} Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.282475 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"f4b4472856e12ef2e3508f6be5b6e68c9d3129f36ea526b2c37b4b7f9e647fc4"} Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.282492 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"10ccff6d05072e48e601267df28089e396741f3bf23529a0b140cca0f1e99e5f"} Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.282507 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"861f73a0691c3dd62291d389109303b8abbc98310bc00aa6be29b7065fce9e69"} Feb 25 11:05:15 crc kubenswrapper[4725]: I0225 11:05:15.284633 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/2.log" Feb 25 11:05:18 crc kubenswrapper[4725]: I0225 11:05:18.308448 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"74b9cdf77fb387d7ab468a169a9010a3d86f530be0bed6008a0063801b95e081"} Feb 25 11:05:20 crc kubenswrapper[4725]: I0225 11:05:20.324491 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" event={"ID":"f1a696ff-eff0-45b5-886a-e721a73a8037","Type":"ContainerStarted","Data":"c2f9a481201932b46c94d5e1c60b8710d134411997395eb6a3afbe1948c12feb"} Feb 25 11:05:20 crc kubenswrapper[4725]: I0225 11:05:20.324812 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:20 crc kubenswrapper[4725]: I0225 11:05:20.324846 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:20 crc kubenswrapper[4725]: I0225 11:05:20.324860 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:20 crc kubenswrapper[4725]: I0225 11:05:20.352036 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" podStartSLOduration=7.352016003 podStartE2EDuration="7.352016003s" podCreationTimestamp="2026-02-25 11:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:05:20.347853319 +0000 UTC m=+745.846435374" watchObservedRunningTime="2026-02-25 11:05:20.352016003 +0000 UTC m=+745.850598048" Feb 25 11:05:20 crc kubenswrapper[4725]: I0225 11:05:20.367087 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:20 crc kubenswrapper[4725]: I0225 11:05:20.367589 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:24 crc kubenswrapper[4725]: I0225 11:05:24.223927 4725 scope.go:117] "RemoveContainer" containerID="e36f678444c7f8932a1272a93dd2c22ee7a9de5680524aba427e492321e3c745" Feb 25 11:05:24 crc kubenswrapper[4725]: E0225 11:05:24.224294 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d6b9f_openshift-multus(7fb276f6-5e43-4b04-a290-42bfdc3b1125)\"" pod="openshift-multus/multus-d6b9f" podUID="7fb276f6-5e43-4b04-a290-42bfdc3b1125" Feb 25 11:05:36 crc kubenswrapper[4725]: I0225 11:05:36.224982 4725 scope.go:117] "RemoveContainer" containerID="e36f678444c7f8932a1272a93dd2c22ee7a9de5680524aba427e492321e3c745" Feb 25 11:05:37 crc kubenswrapper[4725]: I0225 11:05:37.447953 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6b9f_7fb276f6-5e43-4b04-a290-42bfdc3b1125/kube-multus/2.log" Feb 25 11:05:37 crc kubenswrapper[4725]: I0225 11:05:37.448256 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6b9f" event={"ID":"7fb276f6-5e43-4b04-a290-42bfdc3b1125","Type":"ContainerStarted","Data":"4b5be73c8a440cbe7015b6d549b254a1df0e8ec6f69bc28fb3827c5635f19132"} Feb 25 11:05:43 crc kubenswrapper[4725]: I0225 11:05:43.680503 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4mb9" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.431969 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d"] Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.434218 4725 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.436909 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.442536 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d"] Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.574335 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.574420 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.574510 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lctr\" (UniqueName: \"kubernetes.io/projected/6b4bc033-9181-40c7-8264-19b5a49c8e7f-kube-api-access-4lctr\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.675342 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lctr\" (UniqueName: \"kubernetes.io/projected/6b4bc033-9181-40c7-8264-19b5a49c8e7f-kube-api-access-4lctr\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.675421 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.675441 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.675816 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.676272 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.706489 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lctr\" (UniqueName: \"kubernetes.io/projected/6b4bc033-9181-40c7-8264-19b5a49c8e7f-kube-api-access-4lctr\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:53 crc kubenswrapper[4725]: I0225 11:05:53.754995 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:54 crc kubenswrapper[4725]: I0225 11:05:54.017873 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d"] Feb 25 11:05:54 crc kubenswrapper[4725]: W0225 11:05:54.025685 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4bc033_9181_40c7_8264_19b5a49c8e7f.slice/crio-9d98894aad94870d299cd5b2db6bf1c7a40bc534176bc0aac0d8344e785500e9 WatchSource:0}: Error finding container 9d98894aad94870d299cd5b2db6bf1c7a40bc534176bc0aac0d8344e785500e9: Status 404 returned error can't find the container with id 9d98894aad94870d299cd5b2db6bf1c7a40bc534176bc0aac0d8344e785500e9 Feb 25 11:05:54 crc kubenswrapper[4725]: I0225 11:05:54.560421 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" event={"ID":"6b4bc033-9181-40c7-8264-19b5a49c8e7f","Type":"ContainerStarted","Data":"c9223fae819b690e05adace8dd45754cd0f58e74616d60cfdc36f5b44578af67"} Feb 25 11:05:54 crc kubenswrapper[4725]: I0225 11:05:54.560695 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" event={"ID":"6b4bc033-9181-40c7-8264-19b5a49c8e7f","Type":"ContainerStarted","Data":"9d98894aad94870d299cd5b2db6bf1c7a40bc534176bc0aac0d8344e785500e9"} Feb 25 11:05:55 crc kubenswrapper[4725]: I0225 11:05:55.566673 4725 generic.go:334] "Generic (PLEG): container finished" podID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerID="c9223fae819b690e05adace8dd45754cd0f58e74616d60cfdc36f5b44578af67" exitCode=0 Feb 25 11:05:55 crc kubenswrapper[4725]: I0225 11:05:55.566751 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" event={"ID":"6b4bc033-9181-40c7-8264-19b5a49c8e7f","Type":"ContainerDied","Data":"c9223fae819b690e05adace8dd45754cd0f58e74616d60cfdc36f5b44578af67"} Feb 25 11:05:57 crc kubenswrapper[4725]: I0225 11:05:57.581654 4725 generic.go:334] "Generic (PLEG): container finished" podID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerID="587c37213e9de4fe43c4cab9b6e77cc93799db62175b131efc562fcf40614e6e" exitCode=0 Feb 25 11:05:57 crc kubenswrapper[4725]: I0225 11:05:57.581723 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" event={"ID":"6b4bc033-9181-40c7-8264-19b5a49c8e7f","Type":"ContainerDied","Data":"587c37213e9de4fe43c4cab9b6e77cc93799db62175b131efc562fcf40614e6e"} Feb 25 11:05:58 crc kubenswrapper[4725]: I0225 11:05:58.609261 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerID="d440a3e4adb952c82116d2e54ac07226db638920d13482c1304808ffb7bb78e2" exitCode=0 Feb 25 11:05:58 crc kubenswrapper[4725]: I0225 11:05:58.609596 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" event={"ID":"6b4bc033-9181-40c7-8264-19b5a49c8e7f","Type":"ContainerDied","Data":"d440a3e4adb952c82116d2e54ac07226db638920d13482c1304808ffb7bb78e2"} Feb 25 11:05:59 crc kubenswrapper[4725]: I0225 11:05:59.969493 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:05:59 crc kubenswrapper[4725]: I0225 11:05:59.972306 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lctr\" (UniqueName: \"kubernetes.io/projected/6b4bc033-9181-40c7-8264-19b5a49c8e7f-kube-api-access-4lctr\") pod \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " Feb 25 11:05:59 crc kubenswrapper[4725]: I0225 11:05:59.972379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-util\") pod \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " Feb 25 11:05:59 crc kubenswrapper[4725]: I0225 11:05:59.972408 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-bundle\") pod \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\" (UID: \"6b4bc033-9181-40c7-8264-19b5a49c8e7f\") " Feb 25 11:05:59 crc kubenswrapper[4725]: I0225 11:05:59.973025 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-bundle" 
(OuterVolumeSpecName: "bundle") pod "6b4bc033-9181-40c7-8264-19b5a49c8e7f" (UID: "6b4bc033-9181-40c7-8264-19b5a49c8e7f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:05:59 crc kubenswrapper[4725]: I0225 11:05:59.981301 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4bc033-9181-40c7-8264-19b5a49c8e7f-kube-api-access-4lctr" (OuterVolumeSpecName: "kube-api-access-4lctr") pod "6b4bc033-9181-40c7-8264-19b5a49c8e7f" (UID: "6b4bc033-9181-40c7-8264-19b5a49c8e7f"). InnerVolumeSpecName "kube-api-access-4lctr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:05:59 crc kubenswrapper[4725]: I0225 11:05:59.982857 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-util" (OuterVolumeSpecName: "util") pod "6b4bc033-9181-40c7-8264-19b5a49c8e7f" (UID: "6b4bc033-9181-40c7-8264-19b5a49c8e7f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.073379 4725 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-util\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.073411 4725 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4bc033-9181-40c7-8264-19b5a49c8e7f-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.073421 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lctr\" (UniqueName: \"kubernetes.io/projected/6b4bc033-9181-40c7-8264-19b5a49c8e7f-kube-api-access-4lctr\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.139895 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533626-fvvbl"] Feb 25 11:06:00 crc kubenswrapper[4725]: E0225 11:06:00.140194 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerName="pull" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.140216 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerName="pull" Feb 25 11:06:00 crc kubenswrapper[4725]: E0225 11:06:00.140230 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerName="util" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.140240 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerName="util" Feb 25 11:06:00 crc kubenswrapper[4725]: E0225 11:06:00.140255 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerName="extract" Feb 25 11:06:00 crc kubenswrapper[4725]: 
I0225 11:06:00.140267 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerName="extract" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.140415 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4bc033-9181-40c7-8264-19b5a49c8e7f" containerName="extract" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.140913 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533626-fvvbl" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.142916 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.143060 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.145706 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.147387 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533626-fvvbl"] Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.276033 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzblm\" (UniqueName: \"kubernetes.io/projected/ddaa0363-1011-45ef-9e91-11054f3cb3c1-kube-api-access-dzblm\") pod \"auto-csr-approver-29533626-fvvbl\" (UID: \"ddaa0363-1011-45ef-9e91-11054f3cb3c1\") " pod="openshift-infra/auto-csr-approver-29533626-fvvbl" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.377931 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzblm\" (UniqueName: \"kubernetes.io/projected/ddaa0363-1011-45ef-9e91-11054f3cb3c1-kube-api-access-dzblm\") pod \"auto-csr-approver-29533626-fvvbl\" (UID: 
\"ddaa0363-1011-45ef-9e91-11054f3cb3c1\") " pod="openshift-infra/auto-csr-approver-29533626-fvvbl" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.409001 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzblm\" (UniqueName: \"kubernetes.io/projected/ddaa0363-1011-45ef-9e91-11054f3cb3c1-kube-api-access-dzblm\") pod \"auto-csr-approver-29533626-fvvbl\" (UID: \"ddaa0363-1011-45ef-9e91-11054f3cb3c1\") " pod="openshift-infra/auto-csr-approver-29533626-fvvbl" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.463061 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533626-fvvbl" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.625023 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" event={"ID":"6b4bc033-9181-40c7-8264-19b5a49c8e7f","Type":"ContainerDied","Data":"9d98894aad94870d299cd5b2db6bf1c7a40bc534176bc0aac0d8344e785500e9"} Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.625075 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d98894aad94870d299cd5b2db6bf1c7a40bc534176bc0aac0d8344e785500e9" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.625209 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d" Feb 25 11:06:00 crc kubenswrapper[4725]: I0225 11:06:00.718691 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533626-fvvbl"] Feb 25 11:06:00 crc kubenswrapper[4725]: W0225 11:06:00.724166 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddaa0363_1011_45ef_9e91_11054f3cb3c1.slice/crio-5cc893ebacccb6c4dda56b734547659f57897a9a01b03f332b3d94b1e98b817c WatchSource:0}: Error finding container 5cc893ebacccb6c4dda56b734547659f57897a9a01b03f332b3d94b1e98b817c: Status 404 returned error can't find the container with id 5cc893ebacccb6c4dda56b734547659f57897a9a01b03f332b3d94b1e98b817c Feb 25 11:06:01 crc kubenswrapper[4725]: I0225 11:06:01.633121 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533626-fvvbl" event={"ID":"ddaa0363-1011-45ef-9e91-11054f3cb3c1","Type":"ContainerStarted","Data":"5cc893ebacccb6c4dda56b734547659f57897a9a01b03f332b3d94b1e98b817c"} Feb 25 11:06:02 crc kubenswrapper[4725]: I0225 11:06:02.642671 4725 generic.go:334] "Generic (PLEG): container finished" podID="ddaa0363-1011-45ef-9e91-11054f3cb3c1" containerID="1b53b06edbd047e4e3ab11814740bc96fd5e59649ae99a4c964eaa08647244f8" exitCode=0 Feb 25 11:06:02 crc kubenswrapper[4725]: I0225 11:06:02.642727 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533626-fvvbl" event={"ID":"ddaa0363-1011-45ef-9e91-11054f3cb3c1","Type":"ContainerDied","Data":"1b53b06edbd047e4e3ab11814740bc96fd5e59649ae99a4c964eaa08647244f8"} Feb 25 11:06:03 crc kubenswrapper[4725]: I0225 11:06:03.939986 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533626-fvvbl" Feb 25 11:06:04 crc kubenswrapper[4725]: I0225 11:06:04.040641 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzblm\" (UniqueName: \"kubernetes.io/projected/ddaa0363-1011-45ef-9e91-11054f3cb3c1-kube-api-access-dzblm\") pod \"ddaa0363-1011-45ef-9e91-11054f3cb3c1\" (UID: \"ddaa0363-1011-45ef-9e91-11054f3cb3c1\") " Feb 25 11:06:04 crc kubenswrapper[4725]: I0225 11:06:04.045777 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddaa0363-1011-45ef-9e91-11054f3cb3c1-kube-api-access-dzblm" (OuterVolumeSpecName: "kube-api-access-dzblm") pod "ddaa0363-1011-45ef-9e91-11054f3cb3c1" (UID: "ddaa0363-1011-45ef-9e91-11054f3cb3c1"). InnerVolumeSpecName "kube-api-access-dzblm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:06:04 crc kubenswrapper[4725]: I0225 11:06:04.142546 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzblm\" (UniqueName: \"kubernetes.io/projected/ddaa0363-1011-45ef-9e91-11054f3cb3c1-kube-api-access-dzblm\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:04 crc kubenswrapper[4725]: I0225 11:06:04.658179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533626-fvvbl" event={"ID":"ddaa0363-1011-45ef-9e91-11054f3cb3c1","Type":"ContainerDied","Data":"5cc893ebacccb6c4dda56b734547659f57897a9a01b03f332b3d94b1e98b817c"} Feb 25 11:06:04 crc kubenswrapper[4725]: I0225 11:06:04.658245 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533626-fvvbl" Feb 25 11:06:04 crc kubenswrapper[4725]: I0225 11:06:04.658253 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cc893ebacccb6c4dda56b734547659f57897a9a01b03f332b3d94b1e98b817c" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.027291 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533620-79r87"] Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.034888 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533620-79r87"] Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.061407 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rrwts"] Feb 25 11:06:05 crc kubenswrapper[4725]: E0225 11:06:05.061608 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddaa0363-1011-45ef-9e91-11054f3cb3c1" containerName="oc" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.061618 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddaa0363-1011-45ef-9e91-11054f3cb3c1" containerName="oc" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.061712 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddaa0363-1011-45ef-9e91-11054f3cb3c1" containerName="oc" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.062056 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.065541 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.065583 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.065695 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zt7bv" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.082128 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rrwts"] Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.232564 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94932c77-7581-4291-bb30-55e751a0923c" path="/var/lib/kubelet/pods/94932c77-7581-4291-bb30-55e751a0923c/volumes" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.255990 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k24ff\" (UniqueName: \"kubernetes.io/projected/7d7a9448-ae03-426a-8c08-5823c6097b8c-kube-api-access-k24ff\") pod \"nmstate-operator-694c9596b7-rrwts\" (UID: \"7d7a9448-ae03-426a-8c08-5823c6097b8c\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.357246 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k24ff\" (UniqueName: \"kubernetes.io/projected/7d7a9448-ae03-426a-8c08-5823c6097b8c-kube-api-access-k24ff\") pod \"nmstate-operator-694c9596b7-rrwts\" (UID: \"7d7a9448-ae03-426a-8c08-5823c6097b8c\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.377400 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k24ff\" (UniqueName: \"kubernetes.io/projected/7d7a9448-ae03-426a-8c08-5823c6097b8c-kube-api-access-k24ff\") pod \"nmstate-operator-694c9596b7-rrwts\" (UID: \"7d7a9448-ae03-426a-8c08-5823c6097b8c\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.675789 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" Feb 25 11:06:05 crc kubenswrapper[4725]: I0225 11:06:05.959312 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rrwts"] Feb 25 11:06:06 crc kubenswrapper[4725]: I0225 11:06:06.669443 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" event={"ID":"7d7a9448-ae03-426a-8c08-5823c6097b8c","Type":"ContainerStarted","Data":"83659eb6372469b2a804bf92ab971be3348c0b770f2ca0f560fcd53b7753760f"} Feb 25 11:06:08 crc kubenswrapper[4725]: I0225 11:06:08.682026 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" event={"ID":"7d7a9448-ae03-426a-8c08-5823c6097b8c","Type":"ContainerStarted","Data":"c022f1be6a15435d1399cd8f49650ffe2b8e8d9e04d3725375c058bee2e70507"} Feb 25 11:06:08 crc kubenswrapper[4725]: I0225 11:06:08.702210 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-rrwts" podStartSLOduration=1.4715539149999999 podStartE2EDuration="3.702190365s" podCreationTimestamp="2026-02-25 11:06:05 +0000 UTC" firstStartedPulling="2026-02-25 11:06:05.971629343 +0000 UTC m=+791.470211368" lastFinishedPulling="2026-02-25 11:06:08.202265783 +0000 UTC m=+793.700847818" observedRunningTime="2026-02-25 11:06:08.700814559 +0000 UTC m=+794.199396594" watchObservedRunningTime="2026-02-25 11:06:08.702190365 +0000 UTC 
m=+794.200772400" Feb 25 11:06:11 crc kubenswrapper[4725]: I0225 11:06:11.556181 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:06:11 crc kubenswrapper[4725]: I0225 11:06:11.557736 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:06:15 crc kubenswrapper[4725]: I0225 11:06:15.020720 4725 scope.go:117] "RemoveContainer" containerID="8f90bc0e02696b80ad935bdfb0994b643f22eaa198cf6ab0a2bc43b5a0e2667d" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.064074 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.064880 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.067853 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rk582" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.082598 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.083058 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.084769 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.087732 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.096747 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-4xp96"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.097405 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.111684 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.205308 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.206078 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.208529 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.208882 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zhc9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.208893 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.214040 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.218650 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-dbus-socket\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.218694 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/93f80fd4-e221-4c81-ab48-77beb578add9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-sgrvv\" (UID: \"93f80fd4-e221-4c81-ab48-77beb578add9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.218714 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kqg4\" (UniqueName: \"kubernetes.io/projected/82055e0c-d941-42a4-a029-e58f3893b303-kube-api-access-9kqg4\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " 
pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.218731 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-nmstate-lock\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.218752 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mdt\" (UniqueName: \"kubernetes.io/projected/93f80fd4-e221-4c81-ab48-77beb578add9-kube-api-access-c2mdt\") pod \"nmstate-webhook-866bcb46dc-sgrvv\" (UID: \"93f80fd4-e221-4c81-ab48-77beb578add9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.218797 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4ltm\" (UniqueName: \"kubernetes.io/projected/e0523051-56ca-4df3-ae89-488db2c9c37a-kube-api-access-x4ltm\") pod \"nmstate-metrics-58c85c668d-w2z2q\" (UID: \"e0523051-56ca-4df3-ae89-488db2c9c37a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.218818 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-ovs-socket\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319649 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1364b5-8725-454d-962e-a8c86ca27c2b-plugin-serving-cert\") 
pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319692 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w7nb\" (UniqueName: \"kubernetes.io/projected/0b1364b5-8725-454d-962e-a8c86ca27c2b-kube-api-access-9w7nb\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319719 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-dbus-socket\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319739 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0b1364b5-8725-454d-962e-a8c86ca27c2b-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319759 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/93f80fd4-e221-4c81-ab48-77beb578add9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-sgrvv\" (UID: \"93f80fd4-e221-4c81-ab48-77beb578add9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319775 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kqg4\" 
(UniqueName: \"kubernetes.io/projected/82055e0c-d941-42a4-a029-e58f3893b303-kube-api-access-9kqg4\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319792 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-nmstate-lock\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319812 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mdt\" (UniqueName: \"kubernetes.io/projected/93f80fd4-e221-4c81-ab48-77beb578add9-kube-api-access-c2mdt\") pod \"nmstate-webhook-866bcb46dc-sgrvv\" (UID: \"93f80fd4-e221-4c81-ab48-77beb578add9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319894 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4ltm\" (UniqueName: \"kubernetes.io/projected/e0523051-56ca-4df3-ae89-488db2c9c37a-kube-api-access-x4ltm\") pod \"nmstate-metrics-58c85c668d-w2z2q\" (UID: \"e0523051-56ca-4df3-ae89-488db2c9c37a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319917 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-ovs-socket\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.319980 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-ovs-socket\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.320273 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-dbus-socket\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: E0225 11:06:17.320321 4725 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 25 11:06:17 crc kubenswrapper[4725]: E0225 11:06:17.320411 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f80fd4-e221-4c81-ab48-77beb578add9-tls-key-pair podName:93f80fd4-e221-4c81-ab48-77beb578add9 nodeName:}" failed. No retries permitted until 2026-02-25 11:06:17.820391715 +0000 UTC m=+803.318973740 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/93f80fd4-e221-4c81-ab48-77beb578add9-tls-key-pair") pod "nmstate-webhook-866bcb46dc-sgrvv" (UID: "93f80fd4-e221-4c81-ab48-77beb578add9") : secret "openshift-nmstate-webhook" not found Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.320574 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/82055e0c-d941-42a4-a029-e58f3893b303-nmstate-lock\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.338420 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kqg4\" (UniqueName: \"kubernetes.io/projected/82055e0c-d941-42a4-a029-e58f3893b303-kube-api-access-9kqg4\") pod \"nmstate-handler-4xp96\" (UID: \"82055e0c-d941-42a4-a029-e58f3893b303\") " pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.339093 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mdt\" (UniqueName: \"kubernetes.io/projected/93f80fd4-e221-4c81-ab48-77beb578add9-kube-api-access-c2mdt\") pod \"nmstate-webhook-866bcb46dc-sgrvv\" (UID: \"93f80fd4-e221-4c81-ab48-77beb578add9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.341502 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4ltm\" (UniqueName: \"kubernetes.io/projected/e0523051-56ca-4df3-ae89-488db2c9c37a-kube-api-access-x4ltm\") pod \"nmstate-metrics-58c85c668d-w2z2q\" (UID: \"e0523051-56ca-4df3-ae89-488db2c9c37a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.386426 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-848877996-zcd9l"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.387634 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.390192 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.397356 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-848877996-zcd9l"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.422157 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.423106 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1364b5-8725-454d-962e-a8c86ca27c2b-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.423233 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w7nb\" (UniqueName: \"kubernetes.io/projected/0b1364b5-8725-454d-962e-a8c86ca27c2b-kube-api-access-9w7nb\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.423306 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0b1364b5-8725-454d-962e-a8c86ca27c2b-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.424133 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0b1364b5-8725-454d-962e-a8c86ca27c2b-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.436366 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1364b5-8725-454d-962e-a8c86ca27c2b-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.454126 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w7nb\" (UniqueName: \"kubernetes.io/projected/0b1364b5-8725-454d-962e-a8c86ca27c2b-kube-api-access-9w7nb\") pod \"nmstate-console-plugin-5c78fc5d65-mk4rx\" (UID: \"0b1364b5-8725-454d-962e-a8c86ca27c2b\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.520350 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.527645 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-oauth-serving-cert\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.527722 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dz29\" (UniqueName: \"kubernetes.io/projected/562a86ba-23ff-431c-92b1-047797e72df7-kube-api-access-2dz29\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.527762 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/562a86ba-23ff-431c-92b1-047797e72df7-console-oauth-config\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.527799 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-trusted-ca-bundle\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.527904 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/562a86ba-23ff-431c-92b1-047797e72df7-console-serving-cert\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.528014 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-service-ca\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.528056 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-console-config\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.613044 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q"] Feb 25 11:06:17 crc kubenswrapper[4725]: W0225 11:06:17.621012 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0523051_56ca_4df3_ae89_488db2c9c37a.slice/crio-6065123ff5092a188bfe8fe3d5acb58026bd7bdd0910692c8dd1613da959e206 WatchSource:0}: Error finding container 6065123ff5092a188bfe8fe3d5acb58026bd7bdd0910692c8dd1613da959e206: Status 404 returned error can't find the container with id 6065123ff5092a188bfe8fe3d5acb58026bd7bdd0910692c8dd1613da959e206 Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.632882 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-service-ca\") pod 
\"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.634470 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-console-config\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.635249 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-console-config\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.634393 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-service-ca\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.635286 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-oauth-serving-cert\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.635386 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dz29\" (UniqueName: \"kubernetes.io/projected/562a86ba-23ff-431c-92b1-047797e72df7-kube-api-access-2dz29\") pod \"console-848877996-zcd9l\" (UID: 
\"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.635456 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/562a86ba-23ff-431c-92b1-047797e72df7-console-oauth-config\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.635523 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-trusted-ca-bundle\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.635554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/562a86ba-23ff-431c-92b1-047797e72df7-console-serving-cert\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.636294 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-oauth-serving-cert\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.639024 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562a86ba-23ff-431c-92b1-047797e72df7-trusted-ca-bundle\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") 
" pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.644531 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/562a86ba-23ff-431c-92b1-047797e72df7-console-oauth-config\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.648870 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/562a86ba-23ff-431c-92b1-047797e72df7-console-serving-cert\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.655258 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dz29\" (UniqueName: \"kubernetes.io/projected/562a86ba-23ff-431c-92b1-047797e72df7-kube-api-access-2dz29\") pod \"console-848877996-zcd9l\" (UID: \"562a86ba-23ff-431c-92b1-047797e72df7\") " pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.702025 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.731175 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx"] Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.751511 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4xp96" event={"ID":"82055e0c-d941-42a4-a029-e58f3893b303","Type":"ContainerStarted","Data":"0a109a3bdcc0be6d9e6814ae3e0d774e06a003177604e3e1ea51628a5a9a6716"} Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.752658 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" event={"ID":"0b1364b5-8725-454d-962e-a8c86ca27c2b","Type":"ContainerStarted","Data":"1d8ad2447d537203428d112b76efbdf79e11c3c7d4546bac4659553a2977dfd9"} Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.753569 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" event={"ID":"e0523051-56ca-4df3-ae89-488db2c9c37a","Type":"ContainerStarted","Data":"6065123ff5092a188bfe8fe3d5acb58026bd7bdd0910692c8dd1613da959e206"} Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.840499 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/93f80fd4-e221-4c81-ab48-77beb578add9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-sgrvv\" (UID: \"93f80fd4-e221-4c81-ab48-77beb578add9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.845270 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/93f80fd4-e221-4c81-ab48-77beb578add9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-sgrvv\" (UID: \"93f80fd4-e221-4c81-ab48-77beb578add9\") " 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:17 crc kubenswrapper[4725]: I0225 11:06:17.876192 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-848877996-zcd9l"] Feb 25 11:06:17 crc kubenswrapper[4725]: W0225 11:06:17.885466 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562a86ba_23ff_431c_92b1_047797e72df7.slice/crio-1245140e9e16360c80f152d29cf2fc561eff28184c459f7787ac5677db7a49d1 WatchSource:0}: Error finding container 1245140e9e16360c80f152d29cf2fc561eff28184c459f7787ac5677db7a49d1: Status 404 returned error can't find the container with id 1245140e9e16360c80f152d29cf2fc561eff28184c459f7787ac5677db7a49d1 Feb 25 11:06:18 crc kubenswrapper[4725]: I0225 11:06:18.009248 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:18 crc kubenswrapper[4725]: I0225 11:06:18.489127 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv"] Feb 25 11:06:18 crc kubenswrapper[4725]: I0225 11:06:18.760803 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848877996-zcd9l" event={"ID":"562a86ba-23ff-431c-92b1-047797e72df7","Type":"ContainerStarted","Data":"43ad7e0da082e79a0976be87bf66b3e12018d5b502f9064de47b876ec90fb8d7"} Feb 25 11:06:18 crc kubenswrapper[4725]: I0225 11:06:18.760870 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848877996-zcd9l" event={"ID":"562a86ba-23ff-431c-92b1-047797e72df7","Type":"ContainerStarted","Data":"1245140e9e16360c80f152d29cf2fc561eff28184c459f7787ac5677db7a49d1"} Feb 25 11:06:18 crc kubenswrapper[4725]: I0225 11:06:18.761656 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" 
event={"ID":"93f80fd4-e221-4c81-ab48-77beb578add9","Type":"ContainerStarted","Data":"7aef93d270ff65373b37dd582acd31c5a1183a465b5cbd36384db07b5281fe40"} Feb 25 11:06:18 crc kubenswrapper[4725]: I0225 11:06:18.783897 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-848877996-zcd9l" podStartSLOduration=1.783878461 podStartE2EDuration="1.783878461s" podCreationTimestamp="2026-02-25 11:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:06:18.782671639 +0000 UTC m=+804.281253674" watchObservedRunningTime="2026-02-25 11:06:18.783878461 +0000 UTC m=+804.282460496" Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.776557 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" event={"ID":"e0523051-56ca-4df3-ae89-488db2c9c37a","Type":"ContainerStarted","Data":"ffe720af6b092152a5b4904584a466d75c58d1502a7f04fe3144e0505db0b10a"} Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.778868 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" event={"ID":"93f80fd4-e221-4c81-ab48-77beb578add9","Type":"ContainerStarted","Data":"db671a4465957ef54bb4070eb22f606de4aef915b145c161d6eb35c3dfcf2565"} Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.779020 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.781127 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4xp96" event={"ID":"82055e0c-d941-42a4-a029-e58f3893b303","Type":"ContainerStarted","Data":"d277708dbd8c39c1a417c9dee8277130f86b91e869823a42e1edf3f9d257835d"} Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.781305 4725 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.783191 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" event={"ID":"0b1364b5-8725-454d-962e-a8c86ca27c2b","Type":"ContainerStarted","Data":"3f10a58df135280dc929f538c9209d90408f0ae4e7433061ff33e59bfa5cbc2c"} Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.801550 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" podStartSLOduration=1.995172736 podStartE2EDuration="3.801533531s" podCreationTimestamp="2026-02-25 11:06:17 +0000 UTC" firstStartedPulling="2026-02-25 11:06:18.508852394 +0000 UTC m=+804.007434419" lastFinishedPulling="2026-02-25 11:06:20.315213179 +0000 UTC m=+805.813795214" observedRunningTime="2026-02-25 11:06:20.801337636 +0000 UTC m=+806.299919701" watchObservedRunningTime="2026-02-25 11:06:20.801533531 +0000 UTC m=+806.300115566" Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.824012 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-4xp96" podStartSLOduration=0.977436964 podStartE2EDuration="3.823993398s" podCreationTimestamp="2026-02-25 11:06:17 +0000 UTC" firstStartedPulling="2026-02-25 11:06:17.465114701 +0000 UTC m=+802.963696726" lastFinishedPulling="2026-02-25 11:06:20.311671095 +0000 UTC m=+805.810253160" observedRunningTime="2026-02-25 11:06:20.819172 +0000 UTC m=+806.317754045" watchObservedRunningTime="2026-02-25 11:06:20.823993398 +0000 UTC m=+806.322575433" Feb 25 11:06:20 crc kubenswrapper[4725]: I0225 11:06:20.836173 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mk4rx" podStartSLOduration=1.262122528 podStartE2EDuration="3.836154131s" podCreationTimestamp="2026-02-25 11:06:17 +0000 UTC" 
firstStartedPulling="2026-02-25 11:06:17.739869631 +0000 UTC m=+803.238451666" lastFinishedPulling="2026-02-25 11:06:20.313901214 +0000 UTC m=+805.812483269" observedRunningTime="2026-02-25 11:06:20.833954552 +0000 UTC m=+806.332536617" watchObservedRunningTime="2026-02-25 11:06:20.836154131 +0000 UTC m=+806.334736156" Feb 25 11:06:23 crc kubenswrapper[4725]: I0225 11:06:23.806114 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" event={"ID":"e0523051-56ca-4df3-ae89-488db2c9c37a","Type":"ContainerStarted","Data":"8c6000bc5b35ef12a11b6dcece4ed06f268757247f4ae60cf3151ce3ca707dd2"} Feb 25 11:06:27 crc kubenswrapper[4725]: I0225 11:06:27.462785 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-4xp96" Feb 25 11:06:27 crc kubenswrapper[4725]: I0225 11:06:27.492617 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-w2z2q" podStartSLOduration=5.063605696 podStartE2EDuration="10.492550675s" podCreationTimestamp="2026-02-25 11:06:17 +0000 UTC" firstStartedPulling="2026-02-25 11:06:17.62315945 +0000 UTC m=+803.121741475" lastFinishedPulling="2026-02-25 11:06:23.052104389 +0000 UTC m=+808.550686454" observedRunningTime="2026-02-25 11:06:23.839717097 +0000 UTC m=+809.338299182" watchObservedRunningTime="2026-02-25 11:06:27.492550675 +0000 UTC m=+812.991132760" Feb 25 11:06:27 crc kubenswrapper[4725]: I0225 11:06:27.702988 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:27 crc kubenswrapper[4725]: I0225 11:06:27.703202 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:27 crc kubenswrapper[4725]: I0225 11:06:27.709645 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:27 crc kubenswrapper[4725]: I0225 11:06:27.838391 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-848877996-zcd9l" Feb 25 11:06:27 crc kubenswrapper[4725]: I0225 11:06:27.920486 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-f4l29"] Feb 25 11:06:38 crc kubenswrapper[4725]: I0225 11:06:38.018730 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-sgrvv" Feb 25 11:06:41 crc kubenswrapper[4725]: I0225 11:06:41.556184 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:06:41 crc kubenswrapper[4725]: I0225 11:06:41.556730 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:06:43 crc kubenswrapper[4725]: I0225 11:06:43.639324 4725 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 25 11:06:52 crc kubenswrapper[4725]: I0225 11:06:52.975659 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-f4l29" podUID="dcf8d8d2-144e-4232-bd68-b14a9f178c7d" containerName="console" containerID="cri-o://23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8" gracePeriod=15 Feb 25 11:06:53 crc kubenswrapper[4725]: I0225 11:06:53.933759 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-f4l29_dcf8d8d2-144e-4232-bd68-b14a9f178c7d/console/0.log" Feb 25 11:06:53 crc kubenswrapper[4725]: I0225 11:06:53.934079 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-f4l29" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.027970 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-config\") pod \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028028 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-oauth-serving-cert\") pod \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028058 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-serving-cert\") pod \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028083 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-oauth-config\") pod \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028115 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-trusted-ca-bundle\") pod \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028204 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-service-ca\") pod \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028603 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-f4l29_dcf8d8d2-144e-4232-bd68-b14a9f178c7d/console/0.log" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028639 4725 generic.go:334] "Generic (PLEG): container finished" podID="dcf8d8d2-144e-4232-bd68-b14a9f178c7d" containerID="23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8" exitCode=2 Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028667 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f4l29" event={"ID":"dcf8d8d2-144e-4232-bd68-b14a9f178c7d","Type":"ContainerDied","Data":"23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8"} Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028694 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f4l29" event={"ID":"dcf8d8d2-144e-4232-bd68-b14a9f178c7d","Type":"ContainerDied","Data":"250227931cfe3391d1bc3d1691f6a51a264dd4e6b5fbc799f9b6d13d5c296409"} Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028710 4725 scope.go:117] "RemoveContainer" containerID="23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028732 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-f4l29" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028947 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-service-ca" (OuterVolumeSpecName: "service-ca") pod "dcf8d8d2-144e-4232-bd68-b14a9f178c7d" (UID: "dcf8d8d2-144e-4232-bd68-b14a9f178c7d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.029198 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dcf8d8d2-144e-4232-bd68-b14a9f178c7d" (UID: "dcf8d8d2-144e-4232-bd68-b14a9f178c7d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.029612 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-config" (OuterVolumeSpecName: "console-config") pod "dcf8d8d2-144e-4232-bd68-b14a9f178c7d" (UID: "dcf8d8d2-144e-4232-bd68-b14a9f178c7d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.029681 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dcf8d8d2-144e-4232-bd68-b14a9f178c7d" (UID: "dcf8d8d2-144e-4232-bd68-b14a9f178c7d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.028240 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zkzx\" (UniqueName: \"kubernetes.io/projected/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-kube-api-access-9zkzx\") pod \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\" (UID: \"dcf8d8d2-144e-4232-bd68-b14a9f178c7d\") " Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.031276 4725 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.031295 4725 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.031304 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.031314 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.035004 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-kube-api-access-9zkzx" (OuterVolumeSpecName: "kube-api-access-9zkzx") pod "dcf8d8d2-144e-4232-bd68-b14a9f178c7d" (UID: "dcf8d8d2-144e-4232-bd68-b14a9f178c7d"). InnerVolumeSpecName "kube-api-access-9zkzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.039187 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dcf8d8d2-144e-4232-bd68-b14a9f178c7d" (UID: "dcf8d8d2-144e-4232-bd68-b14a9f178c7d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.050230 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dcf8d8d2-144e-4232-bd68-b14a9f178c7d" (UID: "dcf8d8d2-144e-4232-bd68-b14a9f178c7d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.081808 4725 scope.go:117] "RemoveContainer" containerID="23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8" Feb 25 11:06:54 crc kubenswrapper[4725]: E0225 11:06:54.082349 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8\": container with ID starting with 23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8 not found: ID does not exist" containerID="23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.082404 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8"} err="failed to get container status \"23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8\": rpc error: code = NotFound desc = could not find 
container \"23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8\": container with ID starting with 23b62f5bbc18e078ae30fc4fb4f76126229c4f244b69d290318221c3182827f8 not found: ID does not exist" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.132973 4725 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.133015 4725 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.133029 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zkzx\" (UniqueName: \"kubernetes.io/projected/dcf8d8d2-144e-4232-bd68-b14a9f178c7d-kube-api-access-9zkzx\") on node \"crc\" DevicePath \"\"" Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.374199 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-f4l29"] Feb 25 11:06:54 crc kubenswrapper[4725]: I0225 11:06:54.382164 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-f4l29"] Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.242650 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf8d8d2-144e-4232-bd68-b14a9f178c7d" path="/var/lib/kubelet/pods/dcf8d8d2-144e-4232-bd68-b14a9f178c7d/volumes" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.292709 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw"] Feb 25 11:06:55 crc kubenswrapper[4725]: E0225 11:06:55.293236 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dcf8d8d2-144e-4232-bd68-b14a9f178c7d" containerName="console" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.293276 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf8d8d2-144e-4232-bd68-b14a9f178c7d" containerName="console" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.293539 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf8d8d2-144e-4232-bd68-b14a9f178c7d" containerName="console" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.295308 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.298667 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.299084 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw"] Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.351128 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.351211 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr277\" (UniqueName: \"kubernetes.io/projected/c5e3d2f9-7701-4ab5-a043-64fe366bc324-kube-api-access-hr277\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.351296 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.451740 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.451772 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr277\" (UniqueName: \"kubernetes.io/projected/c5e3d2f9-7701-4ab5-a043-64fe366bc324-kube-api-access-hr277\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.451814 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 
11:06:55.452232 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.452255 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.482645 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr277\" (UniqueName: \"kubernetes.io/projected/c5e3d2f9-7701-4ab5-a043-64fe366bc324-kube-api-access-hr277\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:55 crc kubenswrapper[4725]: I0225 11:06:55.630227 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:06:56 crc kubenswrapper[4725]: I0225 11:06:56.139086 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw"] Feb 25 11:06:56 crc kubenswrapper[4725]: W0225 11:06:56.142665 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e3d2f9_7701_4ab5_a043_64fe366bc324.slice/crio-f7e21a966f91ee590a612aa32fc7d5f9790453e93f4dbe165290180d7acb7277 WatchSource:0}: Error finding container f7e21a966f91ee590a612aa32fc7d5f9790453e93f4dbe165290180d7acb7277: Status 404 returned error can't find the container with id f7e21a966f91ee590a612aa32fc7d5f9790453e93f4dbe165290180d7acb7277 Feb 25 11:06:57 crc kubenswrapper[4725]: I0225 11:06:57.057657 4725 generic.go:334] "Generic (PLEG): container finished" podID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerID="9243bf437b9617d5a60f2cf1166cba8e3afdba6df2f920546a34e9dbc591ea91" exitCode=0 Feb 25 11:06:57 crc kubenswrapper[4725]: I0225 11:06:57.058017 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" event={"ID":"c5e3d2f9-7701-4ab5-a043-64fe366bc324","Type":"ContainerDied","Data":"9243bf437b9617d5a60f2cf1166cba8e3afdba6df2f920546a34e9dbc591ea91"} Feb 25 11:06:57 crc kubenswrapper[4725]: I0225 11:06:57.058057 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" event={"ID":"c5e3d2f9-7701-4ab5-a043-64fe366bc324","Type":"ContainerStarted","Data":"f7e21a966f91ee590a612aa32fc7d5f9790453e93f4dbe165290180d7acb7277"} Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.619473 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-nhprm"] Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.620767 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.645397 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhprm"] Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.695271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w2rz\" (UniqueName: \"kubernetes.io/projected/9a8b00ee-fd3d-4454-8869-36a5b9d05245-kube-api-access-9w2rz\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.695311 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-utilities\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.695335 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-catalog-content\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.797077 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w2rz\" (UniqueName: \"kubernetes.io/projected/9a8b00ee-fd3d-4454-8869-36a5b9d05245-kube-api-access-9w2rz\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " 
pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.797426 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-utilities\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.797563 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-catalog-content\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.798242 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-utilities\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.798428 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-catalog-content\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc kubenswrapper[4725]: I0225 11:06:58.824504 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w2rz\" (UniqueName: \"kubernetes.io/projected/9a8b00ee-fd3d-4454-8869-36a5b9d05245-kube-api-access-9w2rz\") pod \"redhat-operators-nhprm\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:58 crc 
kubenswrapper[4725]: I0225 11:06:58.954096 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:06:59 crc kubenswrapper[4725]: I0225 11:06:59.076097 4725 generic.go:334] "Generic (PLEG): container finished" podID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerID="c8e5f858db17e9f5e79fb5e3574ab9f895681830ebc755e46991d4bfcee94c25" exitCode=0 Feb 25 11:06:59 crc kubenswrapper[4725]: I0225 11:06:59.076154 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" event={"ID":"c5e3d2f9-7701-4ab5-a043-64fe366bc324","Type":"ContainerDied","Data":"c8e5f858db17e9f5e79fb5e3574ab9f895681830ebc755e46991d4bfcee94c25"} Feb 25 11:06:59 crc kubenswrapper[4725]: I0225 11:06:59.421739 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhprm"] Feb 25 11:06:59 crc kubenswrapper[4725]: W0225 11:06:59.429709 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8b00ee_fd3d_4454_8869_36a5b9d05245.slice/crio-9aea14b9c3db64517262fb8774f2ed16da80714e13437a548e617c58f4ec38c8 WatchSource:0}: Error finding container 9aea14b9c3db64517262fb8774f2ed16da80714e13437a548e617c58f4ec38c8: Status 404 returned error can't find the container with id 9aea14b9c3db64517262fb8774f2ed16da80714e13437a548e617c58f4ec38c8 Feb 25 11:07:00 crc kubenswrapper[4725]: I0225 11:07:00.083456 4725 generic.go:334] "Generic (PLEG): container finished" podID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerID="c16a2edb5aedf9e6de88a6a658812e1e69f476f6ca5d719f83273e194c0a0ccc" exitCode=0 Feb 25 11:07:00 crc kubenswrapper[4725]: I0225 11:07:00.083537 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" 
event={"ID":"c5e3d2f9-7701-4ab5-a043-64fe366bc324","Type":"ContainerDied","Data":"c16a2edb5aedf9e6de88a6a658812e1e69f476f6ca5d719f83273e194c0a0ccc"} Feb 25 11:07:00 crc kubenswrapper[4725]: I0225 11:07:00.085085 4725 generic.go:334] "Generic (PLEG): container finished" podID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerID="ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e" exitCode=0 Feb 25 11:07:00 crc kubenswrapper[4725]: I0225 11:07:00.085137 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhprm" event={"ID":"9a8b00ee-fd3d-4454-8869-36a5b9d05245","Type":"ContainerDied","Data":"ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e"} Feb 25 11:07:00 crc kubenswrapper[4725]: I0225 11:07:00.085171 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhprm" event={"ID":"9a8b00ee-fd3d-4454-8869-36a5b9d05245","Type":"ContainerStarted","Data":"9aea14b9c3db64517262fb8774f2ed16da80714e13437a548e617c58f4ec38c8"} Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.099148 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhprm" event={"ID":"9a8b00ee-fd3d-4454-8869-36a5b9d05245","Type":"ContainerStarted","Data":"259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09"} Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.403425 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.437130 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr277\" (UniqueName: \"kubernetes.io/projected/c5e3d2f9-7701-4ab5-a043-64fe366bc324-kube-api-access-hr277\") pod \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.437196 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-bundle\") pod \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.437245 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-util\") pod \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\" (UID: \"c5e3d2f9-7701-4ab5-a043-64fe366bc324\") " Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.438151 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-bundle" (OuterVolumeSpecName: "bundle") pod "c5e3d2f9-7701-4ab5-a043-64fe366bc324" (UID: "c5e3d2f9-7701-4ab5-a043-64fe366bc324"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.446478 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e3d2f9-7701-4ab5-a043-64fe366bc324-kube-api-access-hr277" (OuterVolumeSpecName: "kube-api-access-hr277") pod "c5e3d2f9-7701-4ab5-a043-64fe366bc324" (UID: "c5e3d2f9-7701-4ab5-a043-64fe366bc324"). InnerVolumeSpecName "kube-api-access-hr277". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.458600 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-util" (OuterVolumeSpecName: "util") pod "c5e3d2f9-7701-4ab5-a043-64fe366bc324" (UID: "c5e3d2f9-7701-4ab5-a043-64fe366bc324"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.538757 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr277\" (UniqueName: \"kubernetes.io/projected/c5e3d2f9-7701-4ab5-a043-64fe366bc324-kube-api-access-hr277\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.538796 4725 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:01 crc kubenswrapper[4725]: I0225 11:07:01.538812 4725 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5e3d2f9-7701-4ab5-a043-64fe366bc324-util\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:02 crc kubenswrapper[4725]: I0225 11:07:02.110905 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" event={"ID":"c5e3d2f9-7701-4ab5-a043-64fe366bc324","Type":"ContainerDied","Data":"f7e21a966f91ee590a612aa32fc7d5f9790453e93f4dbe165290180d7acb7277"} Feb 25 11:07:02 crc kubenswrapper[4725]: I0225 11:07:02.110963 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw" Feb 25 11:07:02 crc kubenswrapper[4725]: I0225 11:07:02.110966 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7e21a966f91ee590a612aa32fc7d5f9790453e93f4dbe165290180d7acb7277" Feb 25 11:07:02 crc kubenswrapper[4725]: I0225 11:07:02.117588 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhprm" event={"ID":"9a8b00ee-fd3d-4454-8869-36a5b9d05245","Type":"ContainerDied","Data":"259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09"} Feb 25 11:07:02 crc kubenswrapper[4725]: I0225 11:07:02.117461 4725 generic.go:334] "Generic (PLEG): container finished" podID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerID="259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09" exitCode=0 Feb 25 11:07:02 crc kubenswrapper[4725]: I0225 11:07:02.120675 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:07:03 crc kubenswrapper[4725]: I0225 11:07:03.124895 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhprm" event={"ID":"9a8b00ee-fd3d-4454-8869-36a5b9d05245","Type":"ContainerStarted","Data":"49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3"} Feb 25 11:07:03 crc kubenswrapper[4725]: I0225 11:07:03.143237 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nhprm" podStartSLOduration=2.692047131 podStartE2EDuration="5.143216149s" podCreationTimestamp="2026-02-25 11:06:58 +0000 UTC" firstStartedPulling="2026-02-25 11:07:00.0864798 +0000 UTC m=+845.585061835" lastFinishedPulling="2026-02-25 11:07:02.537648788 +0000 UTC m=+848.036230853" observedRunningTime="2026-02-25 11:07:03.13836729 +0000 UTC m=+848.636949345" watchObservedRunningTime="2026-02-25 11:07:03.143216149 +0000 
UTC m=+848.641798194" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.876639 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr"] Feb 25 11:07:08 crc kubenswrapper[4725]: E0225 11:07:08.877761 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerName="util" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.877776 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerName="util" Feb 25 11:07:08 crc kubenswrapper[4725]: E0225 11:07:08.877806 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerName="pull" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.877814 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerName="pull" Feb 25 11:07:08 crc kubenswrapper[4725]: E0225 11:07:08.877846 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerName="extract" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.877855 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerName="extract" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.878607 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e3d2f9-7701-4ab5-a043-64fe366bc324" containerName="extract" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.879780 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.895969 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.896284 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.896410 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.896869 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.897632 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rgsfl" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.912847 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr"] Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.928910 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14923832-70ad-4019-b795-4094d767dfda-apiservice-cert\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.929208 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nk7h\" (UniqueName: \"kubernetes.io/projected/14923832-70ad-4019-b795-4094d767dfda-kube-api-access-4nk7h\") pod 
\"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.929430 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14923832-70ad-4019-b795-4094d767dfda-webhook-cert\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.954645 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:07:08 crc kubenswrapper[4725]: I0225 11:07:08.954884 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.030992 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14923832-70ad-4019-b795-4094d767dfda-apiservice-cert\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.031255 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nk7h\" (UniqueName: \"kubernetes.io/projected/14923832-70ad-4019-b795-4094d767dfda-kube-api-access-4nk7h\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.031355 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14923832-70ad-4019-b795-4094d767dfda-webhook-cert\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.037208 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14923832-70ad-4019-b795-4094d767dfda-webhook-cert\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.037250 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14923832-70ad-4019-b795-4094d767dfda-apiservice-cert\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.067644 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nk7h\" (UniqueName: \"kubernetes.io/projected/14923832-70ad-4019-b795-4094d767dfda-kube-api-access-4nk7h\") pod \"metallb-operator-controller-manager-768ffd8bd5-q5ktr\" (UID: \"14923832-70ad-4019-b795-4094d767dfda\") " pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.215197 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.249277 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm"] Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.250119 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.256621 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.257057 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.257867 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-sqhhj" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.275993 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm"] Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.334327 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50bf643b-abcd-4134-bfd5-a08256ad5652-webhook-cert\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.334599 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50bf643b-abcd-4134-bfd5-a08256ad5652-apiservice-cert\") pod 
\"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.334738 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwq4c\" (UniqueName: \"kubernetes.io/projected/50bf643b-abcd-4134-bfd5-a08256ad5652-kube-api-access-wwq4c\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.435603 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50bf643b-abcd-4134-bfd5-a08256ad5652-apiservice-cert\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.435679 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwq4c\" (UniqueName: \"kubernetes.io/projected/50bf643b-abcd-4134-bfd5-a08256ad5652-kube-api-access-wwq4c\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.435714 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50bf643b-abcd-4134-bfd5-a08256ad5652-webhook-cert\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 
11:07:09.439198 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50bf643b-abcd-4134-bfd5-a08256ad5652-webhook-cert\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.461333 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50bf643b-abcd-4134-bfd5-a08256ad5652-apiservice-cert\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.496955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwq4c\" (UniqueName: \"kubernetes.io/projected/50bf643b-abcd-4134-bfd5-a08256ad5652-kube-api-access-wwq4c\") pod \"metallb-operator-webhook-server-56448fcbcf-jpqnm\" (UID: \"50bf643b-abcd-4134-bfd5-a08256ad5652\") " pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.587044 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.782005 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr"] Feb 25 11:07:09 crc kubenswrapper[4725]: W0225 11:07:09.786514 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14923832_70ad_4019_b795_4094d767dfda.slice/crio-f0cef4c13510292d29587a12c5d3822b8b6bf8ae0f2576f362b8e028bc6ca423 WatchSource:0}: Error finding container f0cef4c13510292d29587a12c5d3822b8b6bf8ae0f2576f362b8e028bc6ca423: Status 404 returned error can't find the container with id f0cef4c13510292d29587a12c5d3822b8b6bf8ae0f2576f362b8e028bc6ca423 Feb 25 11:07:09 crc kubenswrapper[4725]: I0225 11:07:09.866013 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm"] Feb 25 11:07:09 crc kubenswrapper[4725]: W0225 11:07:09.867575 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50bf643b_abcd_4134_bfd5_a08256ad5652.slice/crio-dc44fece49b6c8396642691ca9f85590175b3b36e7a67298d9d8e57c42b8ffbb WatchSource:0}: Error finding container dc44fece49b6c8396642691ca9f85590175b3b36e7a67298d9d8e57c42b8ffbb: Status 404 returned error can't find the container with id dc44fece49b6c8396642691ca9f85590175b3b36e7a67298d9d8e57c42b8ffbb Feb 25 11:07:10 crc kubenswrapper[4725]: I0225 11:07:10.001201 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhprm" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="registry-server" probeResult="failure" output=< Feb 25 11:07:10 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 25 11:07:10 crc kubenswrapper[4725]: > Feb 25 11:07:10 crc 
kubenswrapper[4725]: I0225 11:07:10.171900 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" event={"ID":"14923832-70ad-4019-b795-4094d767dfda","Type":"ContainerStarted","Data":"f0cef4c13510292d29587a12c5d3822b8b6bf8ae0f2576f362b8e028bc6ca423"} Feb 25 11:07:10 crc kubenswrapper[4725]: I0225 11:07:10.173269 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" event={"ID":"50bf643b-abcd-4134-bfd5-a08256ad5652","Type":"ContainerStarted","Data":"dc44fece49b6c8396642691ca9f85590175b3b36e7a67298d9d8e57c42b8ffbb"} Feb 25 11:07:11 crc kubenswrapper[4725]: I0225 11:07:11.555945 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:07:11 crc kubenswrapper[4725]: I0225 11:07:11.556193 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:07:11 crc kubenswrapper[4725]: I0225 11:07:11.556252 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 11:07:11 crc kubenswrapper[4725]: I0225 11:07:11.556962 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"976e63b74d2c07989af044494938e1fa71027bc94145eac91a1d7ca390924f15"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 25 11:07:11 crc kubenswrapper[4725]: I0225 11:07:11.557032 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://976e63b74d2c07989af044494938e1fa71027bc94145eac91a1d7ca390924f15" gracePeriod=600 Feb 25 11:07:12 crc kubenswrapper[4725]: I0225 11:07:12.187279 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="976e63b74d2c07989af044494938e1fa71027bc94145eac91a1d7ca390924f15" exitCode=0 Feb 25 11:07:12 crc kubenswrapper[4725]: I0225 11:07:12.187356 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"976e63b74d2c07989af044494938e1fa71027bc94145eac91a1d7ca390924f15"} Feb 25 11:07:12 crc kubenswrapper[4725]: I0225 11:07:12.187566 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"7caa77cf5b27b9b598253176495f0fa2415fb90743494a0dd02b8750c84c33d8"} Feb 25 11:07:12 crc kubenswrapper[4725]: I0225 11:07:12.187587 4725 scope.go:117] "RemoveContainer" containerID="08647c57662156eda0794f315db9e612765b561e546985866febc0fd340a1ac9" Feb 25 11:07:15 crc kubenswrapper[4725]: I0225 11:07:15.215567 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" event={"ID":"14923832-70ad-4019-b795-4094d767dfda","Type":"ContainerStarted","Data":"b0de6fb76b6a8652aae386c715b08257dc4b79345d5b339e089f3628e3412b71"} Feb 25 11:07:15 crc kubenswrapper[4725]: I0225 11:07:15.216151 4725 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:15 crc kubenswrapper[4725]: I0225 11:07:15.218146 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" event={"ID":"50bf643b-abcd-4134-bfd5-a08256ad5652","Type":"ContainerStarted","Data":"a0af8d28d14dd990693d40daa1e7834e78aa3d8747e1331fd08582a111b9d73d"} Feb 25 11:07:15 crc kubenswrapper[4725]: I0225 11:07:15.218633 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:15 crc kubenswrapper[4725]: I0225 11:07:15.254582 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" podStartSLOduration=2.281067025 podStartE2EDuration="7.254563274s" podCreationTimestamp="2026-02-25 11:07:08 +0000 UTC" firstStartedPulling="2026-02-25 11:07:09.791913218 +0000 UTC m=+855.290495243" lastFinishedPulling="2026-02-25 11:07:14.765409467 +0000 UTC m=+860.263991492" observedRunningTime="2026-02-25 11:07:15.250719222 +0000 UTC m=+860.749301267" watchObservedRunningTime="2026-02-25 11:07:15.254563274 +0000 UTC m=+860.753145309" Feb 25 11:07:15 crc kubenswrapper[4725]: I0225 11:07:15.284641 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" podStartSLOduration=1.376519272 podStartE2EDuration="6.284621243s" podCreationTimestamp="2026-02-25 11:07:09 +0000 UTC" firstStartedPulling="2026-02-25 11:07:09.875943761 +0000 UTC m=+855.374525786" lastFinishedPulling="2026-02-25 11:07:14.784045732 +0000 UTC m=+860.282627757" observedRunningTime="2026-02-25 11:07:15.283263267 +0000 UTC m=+860.781845292" watchObservedRunningTime="2026-02-25 11:07:15.284621243 +0000 UTC m=+860.783203288" Feb 25 11:07:19 crc 
kubenswrapper[4725]: I0225 11:07:19.010151 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:07:19 crc kubenswrapper[4725]: I0225 11:07:19.062938 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:07:19 crc kubenswrapper[4725]: I0225 11:07:19.245433 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhprm"] Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.255726 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nhprm" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="registry-server" containerID="cri-o://49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3" gracePeriod=2 Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.739276 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.818094 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w2rz\" (UniqueName: \"kubernetes.io/projected/9a8b00ee-fd3d-4454-8869-36a5b9d05245-kube-api-access-9w2rz\") pod \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.819059 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-catalog-content\") pod \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.819097 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-utilities\") pod \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\" (UID: \"9a8b00ee-fd3d-4454-8869-36a5b9d05245\") " Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.819922 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-utilities" (OuterVolumeSpecName: "utilities") pod "9a8b00ee-fd3d-4454-8869-36a5b9d05245" (UID: "9a8b00ee-fd3d-4454-8869-36a5b9d05245"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.823877 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a8b00ee-fd3d-4454-8869-36a5b9d05245-kube-api-access-9w2rz" (OuterVolumeSpecName: "kube-api-access-9w2rz") pod "9a8b00ee-fd3d-4454-8869-36a5b9d05245" (UID: "9a8b00ee-fd3d-4454-8869-36a5b9d05245"). InnerVolumeSpecName "kube-api-access-9w2rz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.919994 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w2rz\" (UniqueName: \"kubernetes.io/projected/9a8b00ee-fd3d-4454-8869-36a5b9d05245-kube-api-access-9w2rz\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.920326 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:20 crc kubenswrapper[4725]: I0225 11:07:20.971737 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a8b00ee-fd3d-4454-8869-36a5b9d05245" (UID: "9a8b00ee-fd3d-4454-8869-36a5b9d05245"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.021872 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a8b00ee-fd3d-4454-8869-36a5b9d05245-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.265738 4725 generic.go:334] "Generic (PLEG): container finished" podID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerID="49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3" exitCode=0 Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.265810 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhprm" event={"ID":"9a8b00ee-fd3d-4454-8869-36a5b9d05245","Type":"ContainerDied","Data":"49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3"} Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.265891 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nhprm" event={"ID":"9a8b00ee-fd3d-4454-8869-36a5b9d05245","Type":"ContainerDied","Data":"9aea14b9c3db64517262fb8774f2ed16da80714e13437a548e617c58f4ec38c8"} Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.265929 4725 scope.go:117] "RemoveContainer" containerID="49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.265932 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhprm" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.295371 4725 scope.go:117] "RemoveContainer" containerID="259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.299335 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhprm"] Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.306600 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nhprm"] Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.334180 4725 scope.go:117] "RemoveContainer" containerID="ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.351891 4725 scope.go:117] "RemoveContainer" containerID="49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3" Feb 25 11:07:21 crc kubenswrapper[4725]: E0225 11:07:21.352798 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3\": container with ID starting with 49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3 not found: ID does not exist" containerID="49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.352908 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3"} err="failed to get container status \"49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3\": rpc error: code = NotFound desc = could not find container \"49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3\": container with ID starting with 49bbee785cde1743433c1a90bea56165d523204187230f1d320af7c41a29e6f3 not found: ID does not exist" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.352968 4725 scope.go:117] "RemoveContainer" containerID="259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09" Feb 25 11:07:21 crc kubenswrapper[4725]: E0225 11:07:21.353397 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09\": container with ID starting with 259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09 not found: ID does not exist" containerID="259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.353424 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09"} err="failed to get container status \"259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09\": rpc error: code = NotFound desc = could not find container \"259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09\": container with ID starting with 259ba54f8369ac1d89d9a17b8e09379f1dab1a5f3aa17a736abad08b5b5cda09 not found: ID does not exist" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.353467 4725 scope.go:117] "RemoveContainer" containerID="ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e" Feb 25 11:07:21 crc kubenswrapper[4725]: E0225 
11:07:21.353781 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e\": container with ID starting with ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e not found: ID does not exist" containerID="ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e" Feb 25 11:07:21 crc kubenswrapper[4725]: I0225 11:07:21.353808 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e"} err="failed to get container status \"ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e\": rpc error: code = NotFound desc = could not find container \"ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e\": container with ID starting with ea9c7b4fb80c80af6168395fc69fb337253b71c1053f8b3bb66dfa282505ef3e not found: ID does not exist" Feb 25 11:07:23 crc kubenswrapper[4725]: I0225 11:07:23.235902 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" path="/var/lib/kubelet/pods/9a8b00ee-fd3d-4454-8869-36a5b9d05245/volumes" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.654331 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hz7km"] Feb 25 11:07:25 crc kubenswrapper[4725]: E0225 11:07:25.654905 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="extract-utilities" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.654922 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="extract-utilities" Feb 25 11:07:25 crc kubenswrapper[4725]: E0225 11:07:25.654933 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="extract-content" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.654943 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="extract-content" Feb 25 11:07:25 crc kubenswrapper[4725]: E0225 11:07:25.654960 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="registry-server" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.654967 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="registry-server" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.655101 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8b00ee-fd3d-4454-8869-36a5b9d05245" containerName="registry-server" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.655874 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.669623 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hz7km"] Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.780849 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wjht\" (UniqueName: \"kubernetes.io/projected/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-kube-api-access-8wjht\") pod \"community-operators-hz7km\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.780935 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-utilities\") pod \"community-operators-hz7km\" (UID: 
\"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.780971 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-catalog-content\") pod \"community-operators-hz7km\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.882297 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-utilities\") pod \"community-operators-hz7km\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.882345 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-catalog-content\") pod \"community-operators-hz7km\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.882402 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wjht\" (UniqueName: \"kubernetes.io/projected/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-kube-api-access-8wjht\") pod \"community-operators-hz7km\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.882912 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-utilities\") pod \"community-operators-hz7km\" (UID: 
\"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.882956 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-catalog-content\") pod \"community-operators-hz7km\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.901752 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wjht\" (UniqueName: \"kubernetes.io/projected/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-kube-api-access-8wjht\") pod \"community-operators-hz7km\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:25 crc kubenswrapper[4725]: I0225 11:07:25.974863 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:26 crc kubenswrapper[4725]: I0225 11:07:26.473120 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hz7km"] Feb 25 11:07:26 crc kubenswrapper[4725]: W0225 11:07:26.497040 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3b50e20_dbb2_4d6a_afe2_c82a4a353b25.slice/crio-271a3df3251f1cf89a2d20dab32a3e726bc490ad4e7208a87690c63582744cf7 WatchSource:0}: Error finding container 271a3df3251f1cf89a2d20dab32a3e726bc490ad4e7208a87690c63582744cf7: Status 404 returned error can't find the container with id 271a3df3251f1cf89a2d20dab32a3e726bc490ad4e7208a87690c63582744cf7 Feb 25 11:07:27 crc kubenswrapper[4725]: I0225 11:07:27.313053 4725 generic.go:334] "Generic (PLEG): container finished" podID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerID="8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9" exitCode=0 Feb 25 11:07:27 crc kubenswrapper[4725]: I0225 11:07:27.313149 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz7km" event={"ID":"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25","Type":"ContainerDied","Data":"8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9"} Feb 25 11:07:27 crc kubenswrapper[4725]: I0225 11:07:27.313359 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz7km" event={"ID":"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25","Type":"ContainerStarted","Data":"271a3df3251f1cf89a2d20dab32a3e726bc490ad4e7208a87690c63582744cf7"} Feb 25 11:07:28 crc kubenswrapper[4725]: I0225 11:07:28.320264 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz7km" 
event={"ID":"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25","Type":"ContainerStarted","Data":"73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229"} Feb 25 11:07:29 crc kubenswrapper[4725]: I0225 11:07:29.326257 4725 generic.go:334] "Generic (PLEG): container finished" podID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerID="73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229" exitCode=0 Feb 25 11:07:29 crc kubenswrapper[4725]: I0225 11:07:29.326310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz7km" event={"ID":"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25","Type":"ContainerDied","Data":"73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229"} Feb 25 11:07:29 crc kubenswrapper[4725]: I0225 11:07:29.593479 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-56448fcbcf-jpqnm" Feb 25 11:07:30 crc kubenswrapper[4725]: I0225 11:07:30.336581 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz7km" event={"ID":"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25","Type":"ContainerStarted","Data":"b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244"} Feb 25 11:07:30 crc kubenswrapper[4725]: I0225 11:07:30.360502 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hz7km" podStartSLOduration=2.935540984 podStartE2EDuration="5.360474636s" podCreationTimestamp="2026-02-25 11:07:25 +0000 UTC" firstStartedPulling="2026-02-25 11:07:27.315083428 +0000 UTC m=+872.813665463" lastFinishedPulling="2026-02-25 11:07:29.74001706 +0000 UTC m=+875.238599115" observedRunningTime="2026-02-25 11:07:30.358552815 +0000 UTC m=+875.857134860" watchObservedRunningTime="2026-02-25 11:07:30.360474636 +0000 UTC m=+875.859056711" Feb 25 11:07:35 crc kubenswrapper[4725]: I0225 11:07:35.975529 4725 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:35 crc kubenswrapper[4725]: I0225 11:07:35.976082 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:36 crc kubenswrapper[4725]: I0225 11:07:36.014923 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:36 crc kubenswrapper[4725]: I0225 11:07:36.465462 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.053471 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hz7km"] Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.400753 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hz7km" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="registry-server" containerID="cri-o://b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244" gracePeriod=2 Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.727632 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.769929 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wjht\" (UniqueName: \"kubernetes.io/projected/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-kube-api-access-8wjht\") pod \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.769964 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-utilities\") pod \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.769980 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-catalog-content\") pod \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\" (UID: \"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25\") " Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.770988 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-utilities" (OuterVolumeSpecName: "utilities") pod "b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" (UID: "b3b50e20-dbb2-4d6a-afe2-c82a4a353b25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.788434 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-kube-api-access-8wjht" (OuterVolumeSpecName: "kube-api-access-8wjht") pod "b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" (UID: "b3b50e20-dbb2-4d6a-afe2-c82a4a353b25"). InnerVolumeSpecName "kube-api-access-8wjht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.850213 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" (UID: "b3b50e20-dbb2-4d6a-afe2-c82a4a353b25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.871253 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wjht\" (UniqueName: \"kubernetes.io/projected/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-kube-api-access-8wjht\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.871281 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:38 crc kubenswrapper[4725]: I0225 11:07:38.871291 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.411261 4725 generic.go:334] "Generic (PLEG): container finished" podID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerID="b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244" exitCode=0 Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.411314 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hz7km" event={"ID":"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25","Type":"ContainerDied","Data":"b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244"} Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.412080 4725 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-hz7km" event={"ID":"b3b50e20-dbb2-4d6a-afe2-c82a4a353b25","Type":"ContainerDied","Data":"271a3df3251f1cf89a2d20dab32a3e726bc490ad4e7208a87690c63582744cf7"} Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.412113 4725 scope.go:117] "RemoveContainer" containerID="b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.411362 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hz7km" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.438407 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hz7km"] Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.446740 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hz7km"] Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.447764 4725 scope.go:117] "RemoveContainer" containerID="73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.480569 4725 scope.go:117] "RemoveContainer" containerID="8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.518034 4725 scope.go:117] "RemoveContainer" containerID="b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244" Feb 25 11:07:39 crc kubenswrapper[4725]: E0225 11:07:39.518636 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244\": container with ID starting with b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244 not found: ID does not exist" containerID="b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 
11:07:39.518690 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244"} err="failed to get container status \"b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244\": rpc error: code = NotFound desc = could not find container \"b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244\": container with ID starting with b1114f389e7c9d253849b64793be16e68ee2db3f4f298bb39f51b9b4251d2244 not found: ID does not exist" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.518726 4725 scope.go:117] "RemoveContainer" containerID="73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229" Feb 25 11:07:39 crc kubenswrapper[4725]: E0225 11:07:39.519325 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229\": container with ID starting with 73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229 not found: ID does not exist" containerID="73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.519366 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229"} err="failed to get container status \"73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229\": rpc error: code = NotFound desc = could not find container \"73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229\": container with ID starting with 73823467660d1d9342300a04469746369d80dff73ef6a548efed35cb88cc8229 not found: ID does not exist" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.519394 4725 scope.go:117] "RemoveContainer" containerID="8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9" Feb 25 11:07:39 crc 
kubenswrapper[4725]: E0225 11:07:39.520400 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9\": container with ID starting with 8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9 not found: ID does not exist" containerID="8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9" Feb 25 11:07:39 crc kubenswrapper[4725]: I0225 11:07:39.520454 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9"} err="failed to get container status \"8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9\": rpc error: code = NotFound desc = could not find container \"8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9\": container with ID starting with 8c66ab9177f48d5cb9bafd50bc96d4d3347aa01955ca7ab29b3335e0cfa6a2c9 not found: ID does not exist" Feb 25 11:07:41 crc kubenswrapper[4725]: I0225 11:07:41.240126 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" path="/var/lib/kubelet/pods/b3b50e20-dbb2-4d6a-afe2-c82a4a353b25/volumes" Feb 25 11:07:49 crc kubenswrapper[4725]: I0225 11:07:49.218482 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-768ffd8bd5-q5ktr" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.388526 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hjqx6"] Feb 25 11:07:50 crc kubenswrapper[4725]: E0225 11:07:50.388793 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="extract-utilities" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.388807 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="extract-utilities" Feb 25 11:07:50 crc kubenswrapper[4725]: E0225 11:07:50.388849 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="registry-server" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.388858 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="registry-server" Feb 25 11:07:50 crc kubenswrapper[4725]: E0225 11:07:50.388881 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="extract-content" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.388890 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="extract-content" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.389019 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b50e20-dbb2-4d6a-afe2-c82a4a353b25" containerName="registry-server" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.391473 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.398693 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl"] Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.399424 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.400116 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.400235 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.400427 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-z296v" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.402655 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.414087 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl"] Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439417 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q45j\" (UniqueName: \"kubernetes.io/projected/bcc5161a-6f59-4878-a2ae-5f4a533021c3-kube-api-access-7q45j\") pod \"frr-k8s-webhook-server-78b44bf5bb-xn4fl\" (UID: \"bcc5161a-6f59-4878-a2ae-5f4a533021c3\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439469 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-frr-sockets\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439544 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-frr-conf\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439613 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ced2390-9bb3-44f1-a851-994322d83bff-frr-startup\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439681 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-reloader\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439761 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8mdt\" (UniqueName: \"kubernetes.io/projected/0ced2390-9bb3-44f1-a851-994322d83bff-kube-api-access-r8mdt\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439793 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ced2390-9bb3-44f1-a851-994322d83bff-metrics-certs\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439854 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-metrics\") pod \"frr-k8s-hjqx6\" (UID: 
\"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.439921 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc5161a-6f59-4878-a2ae-5f4a533021c3-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-xn4fl\" (UID: \"bcc5161a-6f59-4878-a2ae-5f4a533021c3\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.486133 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-svwnh"] Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.487264 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.490657 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t4h2j" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.491067 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.491253 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.491424 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.495364 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-67xsd"] Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.496425 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.500150 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.513022 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-67xsd"] Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.540839 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-metrics\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541113 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541242 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc5161a-6f59-4878-a2ae-5f4a533021c3-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-xn4fl\" (UID: \"bcc5161a-6f59-4878-a2ae-5f4a533021c3\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541338 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q45j\" (UniqueName: \"kubernetes.io/projected/bcc5161a-6f59-4878-a2ae-5f4a533021c3-kube-api-access-7q45j\") pod \"frr-k8s-webhook-server-78b44bf5bb-xn4fl\" (UID: \"bcc5161a-6f59-4878-a2ae-5f4a533021c3\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc 
kubenswrapper[4725]: I0225 11:07:50.541432 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-frr-sockets\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541519 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/beb65949-bc67-4f16-892c-8979cc412e9e-cert\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541636 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-frr-conf\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541740 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2262\" (UniqueName: \"kubernetes.io/projected/beb65949-bc67-4f16-892c-8979cc412e9e-kube-api-access-c2262\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541877 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ced2390-9bb3-44f1-a851-994322d83bff-frr-startup\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.541973 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beb65949-bc67-4f16-892c-8979cc412e9e-metrics-certs\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542085 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-metrics-certs\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542189 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-reloader\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542244 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-frr-conf\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542287 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7p7\" (UniqueName: \"kubernetes.io/projected/b577777e-718a-4f09-a76a-98aa4f068184-kube-api-access-5m7p7\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542393 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/b577777e-718a-4f09-a76a-98aa4f068184-metallb-excludel2\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542420 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8mdt\" (UniqueName: \"kubernetes.io/projected/0ced2390-9bb3-44f1-a851-994322d83bff-kube-api-access-r8mdt\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542442 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ced2390-9bb3-44f1-a851-994322d83bff-metrics-certs\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542292 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-frr-sockets\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542546 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-reloader\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.542334 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ced2390-9bb3-44f1-a851-994322d83bff-metrics\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc 
kubenswrapper[4725]: I0225 11:07:50.543076 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ced2390-9bb3-44f1-a851-994322d83bff-frr-startup\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.548519 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ced2390-9bb3-44f1-a851-994322d83bff-metrics-certs\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.549962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc5161a-6f59-4878-a2ae-5f4a533021c3-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-xn4fl\" (UID: \"bcc5161a-6f59-4878-a2ae-5f4a533021c3\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.558671 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q45j\" (UniqueName: \"kubernetes.io/projected/bcc5161a-6f59-4878-a2ae-5f4a533021c3-kube-api-access-7q45j\") pod \"frr-k8s-webhook-server-78b44bf5bb-xn4fl\" (UID: \"bcc5161a-6f59-4878-a2ae-5f4a533021c3\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.562058 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8mdt\" (UniqueName: \"kubernetes.io/projected/0ced2390-9bb3-44f1-a851-994322d83bff-kube-api-access-r8mdt\") pod \"frr-k8s-hjqx6\" (UID: \"0ced2390-9bb3-44f1-a851-994322d83bff\") " pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.643821 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-c2262\" (UniqueName: \"kubernetes.io/projected/beb65949-bc67-4f16-892c-8979cc412e9e-kube-api-access-c2262\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.643873 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beb65949-bc67-4f16-892c-8979cc412e9e-metrics-certs\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.643891 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-metrics-certs\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.643914 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7p7\" (UniqueName: \"kubernetes.io/projected/b577777e-718a-4f09-a76a-98aa4f068184-kube-api-access-5m7p7\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.643941 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b577777e-718a-4f09-a76a-98aa4f068184-metallb-excludel2\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.643982 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.644006 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/beb65949-bc67-4f16-892c-8979cc412e9e-cert\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: E0225 11:07:50.644124 4725 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 25 11:07:50 crc kubenswrapper[4725]: E0225 11:07:50.644214 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-metrics-certs podName:b577777e-718a-4f09-a76a-98aa4f068184 nodeName:}" failed. No retries permitted until 2026-02-25 11:07:51.144188212 +0000 UTC m=+896.642770237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-metrics-certs") pod "speaker-svwnh" (UID: "b577777e-718a-4f09-a76a-98aa4f068184") : secret "speaker-certs-secret" not found Feb 25 11:07:50 crc kubenswrapper[4725]: E0225 11:07:50.644304 4725 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 25 11:07:50 crc kubenswrapper[4725]: E0225 11:07:50.644359 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist podName:b577777e-718a-4f09-a76a-98aa4f068184 nodeName:}" failed. No retries permitted until 2026-02-25 11:07:51.144344176 +0000 UTC m=+896.642926191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist") pod "speaker-svwnh" (UID: "b577777e-718a-4f09-a76a-98aa4f068184") : secret "metallb-memberlist" not found Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.644727 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b577777e-718a-4f09-a76a-98aa4f068184-metallb-excludel2\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.647329 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/beb65949-bc67-4f16-892c-8979cc412e9e-cert\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.647750 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beb65949-bc67-4f16-892c-8979cc412e9e-metrics-certs\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.661642 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2262\" (UniqueName: \"kubernetes.io/projected/beb65949-bc67-4f16-892c-8979cc412e9e-kube-api-access-c2262\") pod \"controller-69bbfbf88f-67xsd\" (UID: \"beb65949-bc67-4f16-892c-8979cc412e9e\") " pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.672645 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7p7\" (UniqueName: 
\"kubernetes.io/projected/b577777e-718a-4f09-a76a-98aa4f068184-kube-api-access-5m7p7\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.711346 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.722750 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.810619 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:50 crc kubenswrapper[4725]: I0225 11:07:50.910387 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl"] Feb 25 11:07:50 crc kubenswrapper[4725]: W0225 11:07:50.916870 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcc5161a_6f59_4878_a2ae_5f4a533021c3.slice/crio-436b99c533630bca41c7000cf75dcf81569a4dfd74983e70a60984986292c9fd WatchSource:0}: Error finding container 436b99c533630bca41c7000cf75dcf81569a4dfd74983e70a60984986292c9fd: Status 404 returned error can't find the container with id 436b99c533630bca41c7000cf75dcf81569a4dfd74983e70a60984986292c9fd Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:50.999320 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-67xsd"] Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.151343 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" 
Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.151413 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-metrics-certs\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:51 crc kubenswrapper[4725]: E0225 11:07:51.151506 4725 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 25 11:07:51 crc kubenswrapper[4725]: E0225 11:07:51.151592 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist podName:b577777e-718a-4f09-a76a-98aa4f068184 nodeName:}" failed. No retries permitted until 2026-02-25 11:07:52.151570234 +0000 UTC m=+897.650152259 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist") pod "speaker-svwnh" (UID: "b577777e-718a-4f09-a76a-98aa4f068184") : secret "metallb-memberlist" not found Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.156557 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-metrics-certs\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.495726 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-67xsd" event={"ID":"beb65949-bc67-4f16-892c-8979cc412e9e","Type":"ContainerStarted","Data":"9452be038ad47e3e35a64421cbb2965d56408897946e858cbbca76dcec610676"} Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.496119 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.496141 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-67xsd" event={"ID":"beb65949-bc67-4f16-892c-8979cc412e9e","Type":"ContainerStarted","Data":"0d6d2123137d95d9b20f9108485e09a4ea356c3263f9ce2f35645928fd2a65b8"} Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.496160 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-67xsd" event={"ID":"beb65949-bc67-4f16-892c-8979cc412e9e","Type":"ContainerStarted","Data":"e368b7b545c2ad30fb874075c1946672383c107f30f860245f247056843c02f9"} Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.496908 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerStarted","Data":"5a3a8eeb171d1a5a3a7498930736cd0ccae19ece3c89aeb1103398b3ea192e10"} Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.498766 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" event={"ID":"bcc5161a-6f59-4878-a2ae-5f4a533021c3","Type":"ContainerStarted","Data":"436b99c533630bca41c7000cf75dcf81569a4dfd74983e70a60984986292c9fd"} Feb 25 11:07:51 crc kubenswrapper[4725]: I0225 11:07:51.523236 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-67xsd" podStartSLOduration=1.523215529 podStartE2EDuration="1.523215529s" podCreationTimestamp="2026-02-25 11:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:07:51.519313115 +0000 UTC m=+897.017895150" watchObservedRunningTime="2026-02-25 11:07:51.523215529 +0000 UTC m=+897.021797574" Feb 25 11:07:52 crc kubenswrapper[4725]: I0225 11:07:52.165975 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:52 crc kubenswrapper[4725]: I0225 11:07:52.171458 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b577777e-718a-4f09-a76a-98aa4f068184-memberlist\") pod \"speaker-svwnh\" (UID: \"b577777e-718a-4f09-a76a-98aa4f068184\") " pod="metallb-system/speaker-svwnh" Feb 25 11:07:52 crc kubenswrapper[4725]: I0225 11:07:52.301462 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-svwnh" Feb 25 11:07:52 crc kubenswrapper[4725]: W0225 11:07:52.344161 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb577777e_718a_4f09_a76a_98aa4f068184.slice/crio-593f600d7bdf9cbbaa469c6ad2b008d16c677e599d6a0f0d2230aacf715f6f29 WatchSource:0}: Error finding container 593f600d7bdf9cbbaa469c6ad2b008d16c677e599d6a0f0d2230aacf715f6f29: Status 404 returned error can't find the container with id 593f600d7bdf9cbbaa469c6ad2b008d16c677e599d6a0f0d2230aacf715f6f29 Feb 25 11:07:52 crc kubenswrapper[4725]: I0225 11:07:52.505394 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-svwnh" event={"ID":"b577777e-718a-4f09-a76a-98aa4f068184","Type":"ContainerStarted","Data":"593f600d7bdf9cbbaa469c6ad2b008d16c677e599d6a0f0d2230aacf715f6f29"} Feb 25 11:07:53 crc kubenswrapper[4725]: I0225 11:07:53.514237 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-svwnh" event={"ID":"b577777e-718a-4f09-a76a-98aa4f068184","Type":"ContainerStarted","Data":"0188a86c8f7942bb9cbc58fd042a20f286ccc22f75d5d4f6f8d2ebb313e7fa62"} Feb 25 11:07:54 crc kubenswrapper[4725]: I0225 11:07:54.543538 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-svwnh" event={"ID":"b577777e-718a-4f09-a76a-98aa4f068184","Type":"ContainerStarted","Data":"4f12f7089edf9d31f77f98366d5398007d640755965edc01250b804581dfd6a2"} Feb 25 11:07:54 crc kubenswrapper[4725]: I0225 11:07:54.544118 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-svwnh" Feb 25 11:07:54 crc kubenswrapper[4725]: I0225 11:07:54.571393 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-svwnh" podStartSLOduration=4.5713712 podStartE2EDuration="4.5713712s" podCreationTimestamp="2026-02-25 11:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:07:54.56537352 +0000 UTC m=+900.063955545" watchObservedRunningTime="2026-02-25 11:07:54.5713712 +0000 UTC m=+900.069953225" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.181785 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533628-ghwlf"] Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.183275 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533628-ghwlf" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.184431 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533628-ghwlf"] Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.215888 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bhp5\" (UniqueName: \"kubernetes.io/projected/b9c2271b-a00c-41f0-a976-ce403bdb24ae-kube-api-access-6bhp5\") pod \"auto-csr-approver-29533628-ghwlf\" (UID: \"b9c2271b-a00c-41f0-a976-ce403bdb24ae\") " pod="openshift-infra/auto-csr-approver-29533628-ghwlf" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.218398 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.218458 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.218755 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.317206 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhp5\" (UniqueName: \"kubernetes.io/projected/b9c2271b-a00c-41f0-a976-ce403bdb24ae-kube-api-access-6bhp5\") pod \"auto-csr-approver-29533628-ghwlf\" (UID: \"b9c2271b-a00c-41f0-a976-ce403bdb24ae\") " pod="openshift-infra/auto-csr-approver-29533628-ghwlf" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.336639 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhp5\" (UniqueName: \"kubernetes.io/projected/b9c2271b-a00c-41f0-a976-ce403bdb24ae-kube-api-access-6bhp5\") pod \"auto-csr-approver-29533628-ghwlf\" (UID: \"b9c2271b-a00c-41f0-a976-ce403bdb24ae\") " 
pod="openshift-infra/auto-csr-approver-29533628-ghwlf" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.536638 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533628-ghwlf" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.587813 4725 generic.go:334] "Generic (PLEG): container finished" podID="0ced2390-9bb3-44f1-a851-994322d83bff" containerID="74c95651b433e16f0699f9d25be1085c069eb578e5fd4513e9b655e25b933a11" exitCode=0 Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.587968 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerDied","Data":"74c95651b433e16f0699f9d25be1085c069eb578e5fd4513e9b655e25b933a11"} Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.591519 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" event={"ID":"bcc5161a-6f59-4878-a2ae-5f4a533021c3","Type":"ContainerStarted","Data":"95a1ec882c10244f6226dd67f8e9e256a9c7ad3167c6eb1b9e567f6a0f156735"} Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.591729 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:08:00 crc kubenswrapper[4725]: I0225 11:08:00.641448 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" podStartSLOduration=1.867739939 podStartE2EDuration="10.641427839s" podCreationTimestamp="2026-02-25 11:07:50 +0000 UTC" firstStartedPulling="2026-02-25 11:07:50.920085433 +0000 UTC m=+896.418667458" lastFinishedPulling="2026-02-25 11:07:59.693773293 +0000 UTC m=+905.192355358" observedRunningTime="2026-02-25 11:08:00.633954619 +0000 UTC m=+906.132536674" watchObservedRunningTime="2026-02-25 11:08:00.641427839 +0000 UTC m=+906.140009884" Feb 25 11:08:00 crc 
kubenswrapper[4725]: I0225 11:08:00.977297 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533628-ghwlf"] Feb 25 11:08:00 crc kubenswrapper[4725]: W0225 11:08:00.985760 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9c2271b_a00c_41f0_a976_ce403bdb24ae.slice/crio-31ba372f05256b65ade2ba82996ecf4b1c67e9eb1c1cf3bcfab7546ea102e4a0 WatchSource:0}: Error finding container 31ba372f05256b65ade2ba82996ecf4b1c67e9eb1c1cf3bcfab7546ea102e4a0: Status 404 returned error can't find the container with id 31ba372f05256b65ade2ba82996ecf4b1c67e9eb1c1cf3bcfab7546ea102e4a0 Feb 25 11:08:01 crc kubenswrapper[4725]: I0225 11:08:01.600132 4725 generic.go:334] "Generic (PLEG): container finished" podID="0ced2390-9bb3-44f1-a851-994322d83bff" containerID="d05713b507c4531adcac3948c62d5bc3ed48dfc0ebfcedb5eb9680e2443174ca" exitCode=0 Feb 25 11:08:01 crc kubenswrapper[4725]: I0225 11:08:01.600345 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerDied","Data":"d05713b507c4531adcac3948c62d5bc3ed48dfc0ebfcedb5eb9680e2443174ca"} Feb 25 11:08:01 crc kubenswrapper[4725]: I0225 11:08:01.603667 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533628-ghwlf" event={"ID":"b9c2271b-a00c-41f0-a976-ce403bdb24ae","Type":"ContainerStarted","Data":"31ba372f05256b65ade2ba82996ecf4b1c67e9eb1c1cf3bcfab7546ea102e4a0"} Feb 25 11:08:02 crc kubenswrapper[4725]: I0225 11:08:02.305592 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-svwnh" Feb 25 11:08:02 crc kubenswrapper[4725]: I0225 11:08:02.615049 4725 generic.go:334] "Generic (PLEG): container finished" podID="0ced2390-9bb3-44f1-a851-994322d83bff" containerID="1174fade2c33e4f2b754ecec25b6b4ccc06e1c0c0de954cf45bfdda0b5270843" 
exitCode=0 Feb 25 11:08:02 crc kubenswrapper[4725]: I0225 11:08:02.615143 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerDied","Data":"1174fade2c33e4f2b754ecec25b6b4ccc06e1c0c0de954cf45bfdda0b5270843"} Feb 25 11:08:03 crc kubenswrapper[4725]: I0225 11:08:03.630650 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerStarted","Data":"73acbb8e603cb7a4ed8c2cd7c1727f5036414c2e0017d3e3f80aae6c4bee3e88"} Feb 25 11:08:03 crc kubenswrapper[4725]: I0225 11:08:03.631033 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerStarted","Data":"aeb709708dadf55ab2188a875c26f7a0a897a42cdec1b81b5490dcd794a4c072"} Feb 25 11:08:03 crc kubenswrapper[4725]: I0225 11:08:03.631045 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerStarted","Data":"37fe9fdc55f4fd6b5a2e80b0590978697fd0e3c483a1ff64d4d709dd27f05417"} Feb 25 11:08:03 crc kubenswrapper[4725]: I0225 11:08:03.631054 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerStarted","Data":"a0d4d942741fcdafa3e32245e03c7c790a0887475a4796a6c382e95a1e281ec6"} Feb 25 11:08:03 crc kubenswrapper[4725]: I0225 11:08:03.637979 4725 generic.go:334] "Generic (PLEG): container finished" podID="b9c2271b-a00c-41f0-a976-ce403bdb24ae" containerID="84d1d9149f6f72c7e26d69054b4a1c741ccdcebea3256a545f06bfb0edf268c2" exitCode=0 Feb 25 11:08:03 crc kubenswrapper[4725]: I0225 11:08:03.638008 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533628-ghwlf" 
event={"ID":"b9c2271b-a00c-41f0-a976-ce403bdb24ae","Type":"ContainerDied","Data":"84d1d9149f6f72c7e26d69054b4a1c741ccdcebea3256a545f06bfb0edf268c2"} Feb 25 11:08:04 crc kubenswrapper[4725]: I0225 11:08:04.648555 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerStarted","Data":"007c7828c3263091227dfaf1e1f5ba7398151e7c4fe6a3d3b57022dbb0192fc4"} Feb 25 11:08:04 crc kubenswrapper[4725]: I0225 11:08:04.648918 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:08:04 crc kubenswrapper[4725]: I0225 11:08:04.648930 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjqx6" event={"ID":"0ced2390-9bb3-44f1-a851-994322d83bff","Type":"ContainerStarted","Data":"f82d285cc4271928cb93f54ee18d6daad20a735043dc5ccaaa6ba184ef679ba9"} Feb 25 11:08:04 crc kubenswrapper[4725]: I0225 11:08:04.679417 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hjqx6" podStartSLOduration=6.048200554 podStartE2EDuration="14.679398758s" podCreationTimestamp="2026-02-25 11:07:50 +0000 UTC" firstStartedPulling="2026-02-25 11:07:51.015058077 +0000 UTC m=+896.513640092" lastFinishedPulling="2026-02-25 11:07:59.646256261 +0000 UTC m=+905.144838296" observedRunningTime="2026-02-25 11:08:04.679194233 +0000 UTC m=+910.177776348" watchObservedRunningTime="2026-02-25 11:08:04.679398758 +0000 UTC m=+910.177980793" Feb 25 11:08:04 crc kubenswrapper[4725]: I0225 11:08:04.961271 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533628-ghwlf" Feb 25 11:08:04 crc kubenswrapper[4725]: I0225 11:08:04.987057 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bhp5\" (UniqueName: \"kubernetes.io/projected/b9c2271b-a00c-41f0-a976-ce403bdb24ae-kube-api-access-6bhp5\") pod \"b9c2271b-a00c-41f0-a976-ce403bdb24ae\" (UID: \"b9c2271b-a00c-41f0-a976-ce403bdb24ae\") " Feb 25 11:08:04 crc kubenswrapper[4725]: I0225 11:08:04.993263 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c2271b-a00c-41f0-a976-ce403bdb24ae-kube-api-access-6bhp5" (OuterVolumeSpecName: "kube-api-access-6bhp5") pod "b9c2271b-a00c-41f0-a976-ce403bdb24ae" (UID: "b9c2271b-a00c-41f0-a976-ce403bdb24ae"). InnerVolumeSpecName "kube-api-access-6bhp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.089262 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bhp5\" (UniqueName: \"kubernetes.io/projected/b9c2271b-a00c-41f0-a976-ce403bdb24ae-kube-api-access-6bhp5\") on node \"crc\" DevicePath \"\"" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.127026 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wbbsw"] Feb 25 11:08:05 crc kubenswrapper[4725]: E0225 11:08:05.127459 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c2271b-a00c-41f0-a976-ce403bdb24ae" containerName="oc" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.127501 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c2271b-a00c-41f0-a976-ce403bdb24ae" containerName="oc" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.127795 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c2271b-a00c-41f0-a976-ce403bdb24ae" containerName="oc" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.128721 4725 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wbbsw" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.132728 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wbbsw"] Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.134406 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fkf8p" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.136794 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.136882 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.190266 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7w9q\" (UniqueName: \"kubernetes.io/projected/34bde9aa-15e3-4428-8a1a-daa93fd8c8b1-kube-api-access-q7w9q\") pod \"openstack-operator-index-wbbsw\" (UID: \"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1\") " pod="openstack-operators/openstack-operator-index-wbbsw" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.291712 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7w9q\" (UniqueName: \"kubernetes.io/projected/34bde9aa-15e3-4428-8a1a-daa93fd8c8b1-kube-api-access-q7w9q\") pod \"openstack-operator-index-wbbsw\" (UID: \"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1\") " pod="openstack-operators/openstack-operator-index-wbbsw" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.307757 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7w9q\" (UniqueName: \"kubernetes.io/projected/34bde9aa-15e3-4428-8a1a-daa93fd8c8b1-kube-api-access-q7w9q\") pod 
\"openstack-operator-index-wbbsw\" (UID: \"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1\") " pod="openstack-operators/openstack-operator-index-wbbsw" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.445934 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wbbsw" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.663223 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533628-ghwlf" event={"ID":"b9c2271b-a00c-41f0-a976-ce403bdb24ae","Type":"ContainerDied","Data":"31ba372f05256b65ade2ba82996ecf4b1c67e9eb1c1cf3bcfab7546ea102e4a0"} Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.663271 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ba372f05256b65ade2ba82996ecf4b1c67e9eb1c1cf3bcfab7546ea102e4a0" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.663238 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533628-ghwlf" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.711655 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.745343 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:08:05 crc kubenswrapper[4725]: I0225 11:08:05.922780 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wbbsw"] Feb 25 11:08:06 crc kubenswrapper[4725]: I0225 11:08:06.031380 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533622-82gpd"] Feb 25 11:08:06 crc kubenswrapper[4725]: I0225 11:08:06.040027 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533622-82gpd"] Feb 25 11:08:06 crc kubenswrapper[4725]: I0225 
11:08:06.673970 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wbbsw" event={"ID":"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1","Type":"ContainerStarted","Data":"2a2729877fcb915e525ecaff2cbd2b1b6c043856679184f50d0dd0f2d55e0a29"} Feb 25 11:08:07 crc kubenswrapper[4725]: I0225 11:08:07.232647 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a880511c-5d78-4ce1-8c43-ecefd558e91c" path="/var/lib/kubelet/pods/a880511c-5d78-4ce1-8c43-ecefd558e91c/volumes" Feb 25 11:08:08 crc kubenswrapper[4725]: I0225 11:08:08.497777 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wbbsw"] Feb 25 11:08:08 crc kubenswrapper[4725]: I0225 11:08:08.688803 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wbbsw" event={"ID":"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1","Type":"ContainerStarted","Data":"43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5"} Feb 25 11:08:08 crc kubenswrapper[4725]: I0225 11:08:08.688972 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wbbsw" podUID="34bde9aa-15e3-4428-8a1a-daa93fd8c8b1" containerName="registry-server" containerID="cri-o://43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5" gracePeriod=2 Feb 25 11:08:08 crc kubenswrapper[4725]: I0225 11:08:08.709755 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wbbsw" podStartSLOduration=1.2771257089999999 podStartE2EDuration="3.709738294s" podCreationTimestamp="2026-02-25 11:08:05 +0000 UTC" firstStartedPulling="2026-02-25 11:08:05.938933824 +0000 UTC m=+911.437515849" lastFinishedPulling="2026-02-25 11:08:08.371546379 +0000 UTC m=+913.870128434" observedRunningTime="2026-02-25 11:08:08.70768733 +0000 UTC m=+914.206269355" watchObservedRunningTime="2026-02-25 
11:08:08.709738294 +0000 UTC m=+914.208320339" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.115380 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sn5wb"] Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.116684 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.127469 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wbbsw" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.131431 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sn5wb"] Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.143326 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kckk7\" (UniqueName: \"kubernetes.io/projected/52de4181-d70f-4961-abfe-957862ec7ed0-kube-api-access-kckk7\") pod \"openstack-operator-index-sn5wb\" (UID: \"52de4181-d70f-4961-abfe-957862ec7ed0\") " pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.245079 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7w9q\" (UniqueName: \"kubernetes.io/projected/34bde9aa-15e3-4428-8a1a-daa93fd8c8b1-kube-api-access-q7w9q\") pod \"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1\" (UID: \"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1\") " Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.245435 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kckk7\" (UniqueName: \"kubernetes.io/projected/52de4181-d70f-4961-abfe-957862ec7ed0-kube-api-access-kckk7\") pod \"openstack-operator-index-sn5wb\" (UID: \"52de4181-d70f-4961-abfe-957862ec7ed0\") " 
pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.252424 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bde9aa-15e3-4428-8a1a-daa93fd8c8b1-kube-api-access-q7w9q" (OuterVolumeSpecName: "kube-api-access-q7w9q") pod "34bde9aa-15e3-4428-8a1a-daa93fd8c8b1" (UID: "34bde9aa-15e3-4428-8a1a-daa93fd8c8b1"). InnerVolumeSpecName "kube-api-access-q7w9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.261780 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kckk7\" (UniqueName: \"kubernetes.io/projected/52de4181-d70f-4961-abfe-957862ec7ed0-kube-api-access-kckk7\") pod \"openstack-operator-index-sn5wb\" (UID: \"52de4181-d70f-4961-abfe-957862ec7ed0\") " pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.346849 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7w9q\" (UniqueName: \"kubernetes.io/projected/34bde9aa-15e3-4428-8a1a-daa93fd8c8b1-kube-api-access-q7w9q\") on node \"crc\" DevicePath \"\"" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.438309 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.702682 4725 generic.go:334] "Generic (PLEG): container finished" podID="34bde9aa-15e3-4428-8a1a-daa93fd8c8b1" containerID="43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5" exitCode=0 Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.702892 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wbbsw" event={"ID":"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1","Type":"ContainerDied","Data":"43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5"} Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.703208 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wbbsw" event={"ID":"34bde9aa-15e3-4428-8a1a-daa93fd8c8b1","Type":"ContainerDied","Data":"2a2729877fcb915e525ecaff2cbd2b1b6c043856679184f50d0dd0f2d55e0a29"} Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.703242 4725 scope.go:117] "RemoveContainer" containerID="43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.702983 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wbbsw" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.733507 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sn5wb"] Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.734062 4725 scope.go:117] "RemoveContainer" containerID="43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5" Feb 25 11:08:09 crc kubenswrapper[4725]: E0225 11:08:09.734453 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5\": container with ID starting with 43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5 not found: ID does not exist" containerID="43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5" Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.734499 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5"} err="failed to get container status \"43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5\": rpc error: code = NotFound desc = could not find container \"43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5\": container with ID starting with 43c19993ef7def9b67f02ccf5b903bd6b04465807ca75a62acdb52463e0e79e5 not found: ID does not exist" Feb 25 11:08:09 crc kubenswrapper[4725]: W0225 11:08:09.738600 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52de4181_d70f_4961_abfe_957862ec7ed0.slice/crio-afea24e8248e4ec9a6e0ccecc09e73efb3c2bd1d8df36e24d7ab9c70280a8726 WatchSource:0}: Error finding container afea24e8248e4ec9a6e0ccecc09e73efb3c2bd1d8df36e24d7ab9c70280a8726: Status 404 returned error can't find the container with id 
afea24e8248e4ec9a6e0ccecc09e73efb3c2bd1d8df36e24d7ab9c70280a8726 Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.749943 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wbbsw"] Feb 25 11:08:09 crc kubenswrapper[4725]: I0225 11:08:09.755342 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wbbsw"] Feb 25 11:08:10 crc kubenswrapper[4725]: I0225 11:08:10.711017 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sn5wb" event={"ID":"52de4181-d70f-4961-abfe-957862ec7ed0","Type":"ContainerStarted","Data":"1c1da0da988864a3a188c72a8e2e6d0e1833b7ce96d702d58a0d8939f71bdbd0"} Feb 25 11:08:10 crc kubenswrapper[4725]: I0225 11:08:10.711388 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sn5wb" event={"ID":"52de4181-d70f-4961-abfe-957862ec7ed0","Type":"ContainerStarted","Data":"afea24e8248e4ec9a6e0ccecc09e73efb3c2bd1d8df36e24d7ab9c70280a8726"} Feb 25 11:08:10 crc kubenswrapper[4725]: I0225 11:08:10.729168 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-sn5wb" podStartSLOduration=1.6485976679999998 podStartE2EDuration="1.729120984s" podCreationTimestamp="2026-02-25 11:08:09 +0000 UTC" firstStartedPulling="2026-02-25 11:08:09.748355881 +0000 UTC m=+915.246937936" lastFinishedPulling="2026-02-25 11:08:09.828879207 +0000 UTC m=+915.327461252" observedRunningTime="2026-02-25 11:08:10.728988451 +0000 UTC m=+916.227570486" watchObservedRunningTime="2026-02-25 11:08:10.729120984 +0000 UTC m=+916.227703009" Feb 25 11:08:10 crc kubenswrapper[4725]: I0225 11:08:10.731898 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xn4fl" Feb 25 11:08:10 crc kubenswrapper[4725]: I0225 11:08:10.815163 4725 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-67xsd" Feb 25 11:08:11 crc kubenswrapper[4725]: I0225 11:08:11.237467 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34bde9aa-15e3-4428-8a1a-daa93fd8c8b1" path="/var/lib/kubelet/pods/34bde9aa-15e3-4428-8a1a-daa93fd8c8b1/volumes" Feb 25 11:08:15 crc kubenswrapper[4725]: I0225 11:08:15.109612 4725 scope.go:117] "RemoveContainer" containerID="62ceca1f770affcb8a1fdea855d8224741bdd2ad16b49b27ffe2b8f0983b396e" Feb 25 11:08:19 crc kubenswrapper[4725]: I0225 11:08:19.439498 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:19 crc kubenswrapper[4725]: I0225 11:08:19.440449 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:19 crc kubenswrapper[4725]: I0225 11:08:19.483887 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:19 crc kubenswrapper[4725]: I0225 11:08:19.839417 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-sn5wb" Feb 25 11:08:20 crc kubenswrapper[4725]: I0225 11:08:20.717034 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hjqx6" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.727211 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k"] Feb 25 11:08:26 crc kubenswrapper[4725]: E0225 11:08:26.728381 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bde9aa-15e3-4428-8a1a-daa93fd8c8b1" containerName="registry-server" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.728408 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="34bde9aa-15e3-4428-8a1a-daa93fd8c8b1" containerName="registry-server" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.728624 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bde9aa-15e3-4428-8a1a-daa93fd8c8b1" containerName="registry-server" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.730129 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.735166 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4czsq" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.739764 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k"] Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.924487 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-util\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.924543 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtn4m\" (UniqueName: \"kubernetes.io/projected/f00c3456-1352-4fa0-90e7-44648edcf473-kube-api-access-jtn4m\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:26 crc kubenswrapper[4725]: I0225 11:08:26.924692 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-bundle\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.025966 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-util\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.026051 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtn4m\" (UniqueName: \"kubernetes.io/projected/f00c3456-1352-4fa0-90e7-44648edcf473-kube-api-access-jtn4m\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.026146 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-bundle\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.026894 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-bundle\") pod 
\"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.026869 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-util\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.062114 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtn4m\" (UniqueName: \"kubernetes.io/projected/f00c3456-1352-4fa0-90e7-44648edcf473-kube-api-access-jtn4m\") pod \"f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.356180 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.663591 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k"] Feb 25 11:08:27 crc kubenswrapper[4725]: W0225 11:08:27.670734 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf00c3456_1352_4fa0_90e7_44648edcf473.slice/crio-e41f16e62c33c8d263360a1ba8c8814ecf104c2df5f23fdd158820b96b11da7d WatchSource:0}: Error finding container e41f16e62c33c8d263360a1ba8c8814ecf104c2df5f23fdd158820b96b11da7d: Status 404 returned error can't find the container with id e41f16e62c33c8d263360a1ba8c8814ecf104c2df5f23fdd158820b96b11da7d Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.916078 4725 generic.go:334] "Generic (PLEG): container finished" podID="f00c3456-1352-4fa0-90e7-44648edcf473" containerID="b5e87d56b196b00c1c3b3912b4427dcd9a5c61b2a0f19b747aff1d7a9b3b2a10" exitCode=0 Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.916147 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" event={"ID":"f00c3456-1352-4fa0-90e7-44648edcf473","Type":"ContainerDied","Data":"b5e87d56b196b00c1c3b3912b4427dcd9a5c61b2a0f19b747aff1d7a9b3b2a10"} Feb 25 11:08:27 crc kubenswrapper[4725]: I0225 11:08:27.916220 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" event={"ID":"f00c3456-1352-4fa0-90e7-44648edcf473","Type":"ContainerStarted","Data":"e41f16e62c33c8d263360a1ba8c8814ecf104c2df5f23fdd158820b96b11da7d"} Feb 25 11:08:28 crc kubenswrapper[4725]: I0225 11:08:28.927198 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="f00c3456-1352-4fa0-90e7-44648edcf473" containerID="c66df4dec9eff67042b2ebc47d8de4c9b877a2beadce5698780cc10a76003745" exitCode=0 Feb 25 11:08:28 crc kubenswrapper[4725]: I0225 11:08:28.927253 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" event={"ID":"f00c3456-1352-4fa0-90e7-44648edcf473","Type":"ContainerDied","Data":"c66df4dec9eff67042b2ebc47d8de4c9b877a2beadce5698780cc10a76003745"} Feb 25 11:08:29 crc kubenswrapper[4725]: I0225 11:08:29.941608 4725 generic.go:334] "Generic (PLEG): container finished" podID="f00c3456-1352-4fa0-90e7-44648edcf473" containerID="6da79fa741b817c0e150d2285779382f8e6212a8295a96844acc39cc9c456ac4" exitCode=0 Feb 25 11:08:29 crc kubenswrapper[4725]: I0225 11:08:29.941686 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" event={"ID":"f00c3456-1352-4fa0-90e7-44648edcf473","Type":"ContainerDied","Data":"6da79fa741b817c0e150d2285779382f8e6212a8295a96844acc39cc9c456ac4"} Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.292775 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.491855 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-bundle\") pod \"f00c3456-1352-4fa0-90e7-44648edcf473\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.492067 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtn4m\" (UniqueName: \"kubernetes.io/projected/f00c3456-1352-4fa0-90e7-44648edcf473-kube-api-access-jtn4m\") pod \"f00c3456-1352-4fa0-90e7-44648edcf473\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.492108 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-util\") pod \"f00c3456-1352-4fa0-90e7-44648edcf473\" (UID: \"f00c3456-1352-4fa0-90e7-44648edcf473\") " Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.493556 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-bundle" (OuterVolumeSpecName: "bundle") pod "f00c3456-1352-4fa0-90e7-44648edcf473" (UID: "f00c3456-1352-4fa0-90e7-44648edcf473"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.500406 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f00c3456-1352-4fa0-90e7-44648edcf473-kube-api-access-jtn4m" (OuterVolumeSpecName: "kube-api-access-jtn4m") pod "f00c3456-1352-4fa0-90e7-44648edcf473" (UID: "f00c3456-1352-4fa0-90e7-44648edcf473"). InnerVolumeSpecName "kube-api-access-jtn4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.525996 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-util" (OuterVolumeSpecName: "util") pod "f00c3456-1352-4fa0-90e7-44648edcf473" (UID: "f00c3456-1352-4fa0-90e7-44648edcf473"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.593548 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtn4m\" (UniqueName: \"kubernetes.io/projected/f00c3456-1352-4fa0-90e7-44648edcf473-kube-api-access-jtn4m\") on node \"crc\" DevicePath \"\"" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.593586 4725 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-util\") on node \"crc\" DevicePath \"\"" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.593600 4725 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00c3456-1352-4fa0-90e7-44648edcf473-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.960969 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" event={"ID":"f00c3456-1352-4fa0-90e7-44648edcf473","Type":"ContainerDied","Data":"e41f16e62c33c8d263360a1ba8c8814ecf104c2df5f23fdd158820b96b11da7d"} Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.961282 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e41f16e62c33c8d263360a1ba8c8814ecf104c2df5f23fdd158820b96b11da7d" Feb 25 11:08:31 crc kubenswrapper[4725]: I0225 11:08:31.961293 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.868633 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj"] Feb 25 11:08:38 crc kubenswrapper[4725]: E0225 11:08:38.868982 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00c3456-1352-4fa0-90e7-44648edcf473" containerName="util" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.869001 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00c3456-1352-4fa0-90e7-44648edcf473" containerName="util" Feb 25 11:08:38 crc kubenswrapper[4725]: E0225 11:08:38.869024 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00c3456-1352-4fa0-90e7-44648edcf473" containerName="pull" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.869058 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00c3456-1352-4fa0-90e7-44648edcf473" containerName="pull" Feb 25 11:08:38 crc kubenswrapper[4725]: E0225 11:08:38.869070 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00c3456-1352-4fa0-90e7-44648edcf473" containerName="extract" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.869080 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00c3456-1352-4fa0-90e7-44648edcf473" containerName="extract" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.869237 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00c3456-1352-4fa0-90e7-44648edcf473" containerName="extract" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.869850 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.873553 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-q6rqg" Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.885396 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj"] Feb 25 11:08:38 crc kubenswrapper[4725]: I0225 11:08:38.998251 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrmq\" (UniqueName: \"kubernetes.io/projected/267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe-kube-api-access-8hrmq\") pod \"openstack-operator-controller-init-74c9788cdf-zqhdj\" (UID: \"267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe\") " pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" Feb 25 11:08:39 crc kubenswrapper[4725]: I0225 11:08:39.100713 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrmq\" (UniqueName: \"kubernetes.io/projected/267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe-kube-api-access-8hrmq\") pod \"openstack-operator-controller-init-74c9788cdf-zqhdj\" (UID: \"267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe\") " pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" Feb 25 11:08:39 crc kubenswrapper[4725]: I0225 11:08:39.122428 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hrmq\" (UniqueName: \"kubernetes.io/projected/267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe-kube-api-access-8hrmq\") pod \"openstack-operator-controller-init-74c9788cdf-zqhdj\" (UID: \"267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe\") " pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" Feb 25 11:08:39 crc kubenswrapper[4725]: I0225 11:08:39.211916 4725 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" Feb 25 11:08:39 crc kubenswrapper[4725]: I0225 11:08:39.704357 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj"] Feb 25 11:08:40 crc kubenswrapper[4725]: I0225 11:08:40.029164 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" event={"ID":"267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe","Type":"ContainerStarted","Data":"d60c0af8dce98fba8f523e51fb5ae244ac4b47c12dbfc1c3698d5411ce2547df"} Feb 25 11:08:44 crc kubenswrapper[4725]: I0225 11:08:44.057571 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" event={"ID":"267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe","Type":"ContainerStarted","Data":"2b4ca8fc641996c0466ccd3909c67f5f0efd116dc8424c776073bd1956eb35c1"} Feb 25 11:08:44 crc kubenswrapper[4725]: I0225 11:08:44.058459 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" Feb 25 11:08:44 crc kubenswrapper[4725]: I0225 11:08:44.092224 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" podStartSLOduration=2.362918507 podStartE2EDuration="6.092204968s" podCreationTimestamp="2026-02-25 11:08:38 +0000 UTC" firstStartedPulling="2026-02-25 11:08:39.711009549 +0000 UTC m=+945.209591574" lastFinishedPulling="2026-02-25 11:08:43.44029601 +0000 UTC m=+948.938878035" observedRunningTime="2026-02-25 11:08:44.089963128 +0000 UTC m=+949.588545193" watchObservedRunningTime="2026-02-25 11:08:44.092204968 +0000 UTC m=+949.590786993" Feb 25 11:08:49 crc kubenswrapper[4725]: I0225 11:08:49.216029 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-74c9788cdf-zqhdj" Feb 25 11:09:11 crc kubenswrapper[4725]: I0225 11:09:11.556059 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:09:11 crc kubenswrapper[4725]: I0225 11:09:11.556764 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.604206 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zhxt8"] Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.606240 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.628419 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhxt8"] Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.744546 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rzx\" (UniqueName: \"kubernetes.io/projected/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-kube-api-access-j6rzx\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.744624 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-utilities\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.744752 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-catalog-content\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.845998 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-catalog-content\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.846075 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j6rzx\" (UniqueName: \"kubernetes.io/projected/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-kube-api-access-j6rzx\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.846107 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-utilities\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.846518 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-catalog-content\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.846544 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-utilities\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.868661 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rzx\" (UniqueName: \"kubernetes.io/projected/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-kube-api-access-j6rzx\") pod \"certified-operators-zhxt8\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:24 crc kubenswrapper[4725]: I0225 11:09:24.928352 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:25 crc kubenswrapper[4725]: I0225 11:09:25.357236 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhxt8"] Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.344983 4725 generic.go:334] "Generic (PLEG): container finished" podID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerID="1d4e50a85ac8215f908d97d23d7efdf33c1ac13931a7780aa2d8780b5305210a" exitCode=0 Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.345058 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhxt8" event={"ID":"c0cca41d-a3cc-4060-becc-ba00a60dd9bc","Type":"ContainerDied","Data":"1d4e50a85ac8215f908d97d23d7efdf33c1ac13931a7780aa2d8780b5305210a"} Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.345695 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhxt8" event={"ID":"c0cca41d-a3cc-4060-becc-ba00a60dd9bc","Type":"ContainerStarted","Data":"1f8035c9d3fc9422fe0d939a32c402b4e96670a482ba045605dba19dc4d4d20d"} Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.898893 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-l278b"] Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.899737 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.901178 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-kth85" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.916087 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv"] Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.917214 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.925166 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-rrjgm" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.930232 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c"] Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.931032 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.934214 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-8hms2" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.938094 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-l278b"] Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.944026 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv"] Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.969502 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26"] Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.970271 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.973946 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74j4c\" (UniqueName: \"kubernetes.io/projected/27540507-aac9-4fd2-84a9-34a2a20885d7-kube-api-access-74j4c\") pod \"cinder-operator-controller-manager-55d77d7b5c-65rfv\" (UID: \"27540507-aac9-4fd2-84a9-34a2a20885d7\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.978674 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-f52l4" Feb 25 11:09:26 crc kubenswrapper[4725]: I0225 11:09:26.979574 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.036128 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.073134 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.075913 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvvpt\" (UniqueName: \"kubernetes.io/projected/a897851d-6b6d-40e1-82f2-ef4db97b19d9-kube-api-access-dvvpt\") pod \"designate-operator-controller-manager-6d8bf5c495-gm94c\" (UID: \"a897851d-6b6d-40e1-82f2-ef4db97b19d9\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.075986 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-74j4c\" (UniqueName: \"kubernetes.io/projected/27540507-aac9-4fd2-84a9-34a2a20885d7-kube-api-access-74j4c\") pod \"cinder-operator-controller-manager-55d77d7b5c-65rfv\" (UID: \"27540507-aac9-4fd2-84a9-34a2a20885d7\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.076011 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vhn\" (UniqueName: \"kubernetes.io/projected/41775582-fd78-4c34-93fc-60b9cdc55a2c-kube-api-access-n8vhn\") pod \"barbican-operator-controller-manager-868647ff47-l278b\" (UID: \"41775582-fd78-4c34-93fc-60b9cdc55a2c\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.076053 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgphr\" (UniqueName: \"kubernetes.io/projected/22854bfa-3684-4750-b2f7-e5ccbe3e92fb-kube-api-access-kgphr\") pod \"glance-operator-controller-manager-784b5bb6c5-97g26\" (UID: \"22854bfa-3684-4750-b2f7-e5ccbe3e92fb\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.079363 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.083507 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.083899 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.086697 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qhs6h" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.087227 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gpql4" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.108960 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74j4c\" (UniqueName: \"kubernetes.io/projected/27540507-aac9-4fd2-84a9-34a2a20885d7-kube-api-access-74j4c\") pod \"cinder-operator-controller-manager-55d77d7b5c-65rfv\" (UID: \"27540507-aac9-4fd2-84a9-34a2a20885d7\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.116727 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-6872z"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.117473 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.119441 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.119651 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8bx74" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.151875 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.167946 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.180305 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vhn\" (UniqueName: \"kubernetes.io/projected/41775582-fd78-4c34-93fc-60b9cdc55a2c-kube-api-access-n8vhn\") pod \"barbican-operator-controller-manager-868647ff47-l278b\" (UID: \"41775582-fd78-4c34-93fc-60b9cdc55a2c\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.180397 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgphr\" (UniqueName: \"kubernetes.io/projected/22854bfa-3684-4750-b2f7-e5ccbe3e92fb-kube-api-access-kgphr\") pod \"glance-operator-controller-manager-784b5bb6c5-97g26\" (UID: \"22854bfa-3684-4750-b2f7-e5ccbe3e92fb\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.180437 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvvpt\" (UniqueName: 
\"kubernetes.io/projected/a897851d-6b6d-40e1-82f2-ef4db97b19d9-kube-api-access-dvvpt\") pod \"designate-operator-controller-manager-6d8bf5c495-gm94c\" (UID: \"a897851d-6b6d-40e1-82f2-ef4db97b19d9\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.202737 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-6872z"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.235619 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvvpt\" (UniqueName: \"kubernetes.io/projected/a897851d-6b6d-40e1-82f2-ef4db97b19d9-kube-api-access-dvvpt\") pod \"designate-operator-controller-manager-6d8bf5c495-gm94c\" (UID: \"a897851d-6b6d-40e1-82f2-ef4db97b19d9\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.235892 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.243011 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.243558 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8vhn\" (UniqueName: \"kubernetes.io/projected/41775582-fd78-4c34-93fc-60b9cdc55a2c-kube-api-access-n8vhn\") pod \"barbican-operator-controller-manager-868647ff47-l278b\" (UID: \"41775582-fd78-4c34-93fc-60b9cdc55a2c\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.246318 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.246658 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-q4vrs" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.249432 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgphr\" (UniqueName: \"kubernetes.io/projected/22854bfa-3684-4750-b2f7-e5ccbe3e92fb-kube-api-access-kgphr\") pod \"glance-operator-controller-manager-784b5bb6c5-97g26\" (UID: \"22854bfa-3684-4750-b2f7-e5ccbe3e92fb\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.265177 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.271495 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.272124 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-25sql"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.273213 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.273728 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.274722 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hsctv" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.283459 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcl7z\" (UniqueName: \"kubernetes.io/projected/6cf86133-a9ef-4a8b-a957-ef8e588b200e-kube-api-access-kcl7z\") pod \"horizon-operator-controller-manager-5b9b8895d5-j4hbq\" (UID: \"6cf86133-a9ef-4a8b-a957-ef8e588b200e\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.283505 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbn7w\" (UniqueName: \"kubernetes.io/projected/0755d178-0ceb-41f1-a26c-e96e466f8300-kube-api-access-mbn7w\") pod \"heat-operator-controller-manager-69f49c598c-wj5dw\" (UID: \"0755d178-0ceb-41f1-a26c-e96e466f8300\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.283537 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.283575 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmcp7\" (UniqueName: \"kubernetes.io/projected/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-kube-api-access-lmcp7\") pod 
\"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.287527 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-w27rn" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.294862 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-25sql"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.314516 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.320802 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.321629 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.323177 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wd4gh" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.344450 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.345189 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.347000 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.350202 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2hjb4" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.363956 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.386271 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmcp7\" (UniqueName: \"kubernetes.io/projected/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-kube-api-access-lmcp7\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.386332 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svrbl\" (UniqueName: \"kubernetes.io/projected/0279e1a1-c275-48e8-815c-0afae718b93a-kube-api-access-svrbl\") pod \"manila-operator-controller-manager-67d996989d-25sql\" (UID: \"0279e1a1-c275-48e8-815c-0afae718b93a\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.386357 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rnq\" (UniqueName: \"kubernetes.io/projected/015fdc09-2359-48f1-9800-9d44efc254fc-kube-api-access-z5rnq\") pod \"ironic-operator-controller-manager-554564d7fc-h2tmg\" (UID: \"015fdc09-2359-48f1-9800-9d44efc254fc\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.386388 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcl7z\" (UniqueName: \"kubernetes.io/projected/6cf86133-a9ef-4a8b-a957-ef8e588b200e-kube-api-access-kcl7z\") pod \"horizon-operator-controller-manager-5b9b8895d5-j4hbq\" (UID: \"6cf86133-a9ef-4a8b-a957-ef8e588b200e\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.386419 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbn7w\" (UniqueName: \"kubernetes.io/projected/0755d178-0ceb-41f1-a26c-e96e466f8300-kube-api-access-mbn7w\") pod \"heat-operator-controller-manager-69f49c598c-wj5dw\" (UID: \"0755d178-0ceb-41f1-a26c-e96e466f8300\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.386455 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hjbp\" (UniqueName: \"kubernetes.io/projected/9a7b2bf7-fab5-4634-9dfa-147dc2de21bc-kube-api-access-6hjbp\") pod \"keystone-operator-controller-manager-b4d948c87-2vhq7\" (UID: \"9a7b2bf7-fab5-4634-9dfa-147dc2de21bc\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.386476 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:27 crc kubenswrapper[4725]: E0225 11:09:27.386568 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 11:09:27 crc 
kubenswrapper[4725]: E0225 11:09:27.386608 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert podName:b82c26d2-a08f-4c57-a876-9ac8a87c1fcf nodeName:}" failed. No retries permitted until 2026-02-25 11:09:27.886594248 +0000 UTC m=+993.385176273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert") pod "infra-operator-controller-manager-79d975b745-6872z" (UID: "b82c26d2-a08f-4c57-a876-9ac8a87c1fcf") : secret "infra-operator-webhook-server-cert" not found Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.400489 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.412198 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbn7w\" (UniqueName: \"kubernetes.io/projected/0755d178-0ceb-41f1-a26c-e96e466f8300-kube-api-access-mbn7w\") pod \"heat-operator-controller-manager-69f49c598c-wj5dw\" (UID: \"0755d178-0ceb-41f1-a26c-e96e466f8300\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.442878 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmcp7\" (UniqueName: \"kubernetes.io/projected/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-kube-api-access-lmcp7\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.457715 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcl7z\" (UniqueName: \"kubernetes.io/projected/6cf86133-a9ef-4a8b-a957-ef8e588b200e-kube-api-access-kcl7z\") pod 
\"horizon-operator-controller-manager-5b9b8895d5-j4hbq\" (UID: \"6cf86133-a9ef-4a8b-a957-ef8e588b200e\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.460894 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.461961 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.470286 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ntw7j" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.478772 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.481943 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.499686 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps7hc\" (UniqueName: \"kubernetes.io/projected/5b458e63-ce2e-4d37-9509-5b31170d932f-kube-api-access-ps7hc\") pod \"mariadb-operator-controller-manager-6994f66f48-6s7s5\" (UID: \"5b458e63-ce2e-4d37-9509-5b31170d932f\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.499752 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hjbp\" (UniqueName: \"kubernetes.io/projected/9a7b2bf7-fab5-4634-9dfa-147dc2de21bc-kube-api-access-6hjbp\") pod 
\"keystone-operator-controller-manager-b4d948c87-2vhq7\" (UID: \"9a7b2bf7-fab5-4634-9dfa-147dc2de21bc\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.499815 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts9sx\" (UniqueName: \"kubernetes.io/projected/c07a7a9d-d976-4d10-af1d-b92b5da76d71-kube-api-access-ts9sx\") pod \"neutron-operator-controller-manager-6bd4687957-pxnr7\" (UID: \"c07a7a9d-d976-4d10-af1d-b92b5da76d71\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.499857 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svrbl\" (UniqueName: \"kubernetes.io/projected/0279e1a1-c275-48e8-815c-0afae718b93a-kube-api-access-svrbl\") pod \"manila-operator-controller-manager-67d996989d-25sql\" (UID: \"0279e1a1-c275-48e8-815c-0afae718b93a\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.500164 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rnq\" (UniqueName: \"kubernetes.io/projected/015fdc09-2359-48f1-9800-9d44efc254fc-kube-api-access-z5rnq\") pod \"ironic-operator-controller-manager-554564d7fc-h2tmg\" (UID: \"015fdc09-2359-48f1-9800-9d44efc254fc\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.512924 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.521590 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.522320 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.523256 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.529103 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rsxp5" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.545031 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hjbp\" (UniqueName: \"kubernetes.io/projected/9a7b2bf7-fab5-4634-9dfa-147dc2de21bc-kube-api-access-6hjbp\") pod \"keystone-operator-controller-manager-b4d948c87-2vhq7\" (UID: \"9a7b2bf7-fab5-4634-9dfa-147dc2de21bc\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.545438 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.551685 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.552477 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.553786 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svrbl\" (UniqueName: \"kubernetes.io/projected/0279e1a1-c275-48e8-815c-0afae718b93a-kube-api-access-svrbl\") pod \"manila-operator-controller-manager-67d996989d-25sql\" (UID: \"0279e1a1-c275-48e8-815c-0afae718b93a\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.560596 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rnq\" (UniqueName: \"kubernetes.io/projected/015fdc09-2359-48f1-9800-9d44efc254fc-kube-api-access-z5rnq\") pod \"ironic-operator-controller-manager-554564d7fc-h2tmg\" (UID: \"015fdc09-2359-48f1-9800-9d44efc254fc\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.562214 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-msnf9" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.567545 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.568314 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.573735 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vslsn" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.573770 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.595638 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.601353 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps7hc\" (UniqueName: \"kubernetes.io/projected/5b458e63-ce2e-4d37-9509-5b31170d932f-kube-api-access-ps7hc\") pod \"mariadb-operator-controller-manager-6994f66f48-6s7s5\" (UID: \"5b458e63-ce2e-4d37-9509-5b31170d932f\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.601404 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwsst\" (UniqueName: \"kubernetes.io/projected/ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6-kube-api-access-lwsst\") pod \"nova-operator-controller-manager-567668f5cf-kn6fp\" (UID: \"ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.601494 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts9sx\" (UniqueName: \"kubernetes.io/projected/c07a7a9d-d976-4d10-af1d-b92b5da76d71-kube-api-access-ts9sx\") pod \"neutron-operator-controller-manager-6bd4687957-pxnr7\" 
(UID: \"c07a7a9d-d976-4d10-af1d-b92b5da76d71\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.623388 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.624286 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.627392 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-ss44j" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.627705 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts9sx\" (UniqueName: \"kubernetes.io/projected/c07a7a9d-d976-4d10-af1d-b92b5da76d71-kube-api-access-ts9sx\") pod \"neutron-operator-controller-manager-6bd4687957-pxnr7\" (UID: \"c07a7a9d-d976-4d10-af1d-b92b5da76d71\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.637307 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.646292 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps7hc\" (UniqueName: \"kubernetes.io/projected/5b458e63-ce2e-4d37-9509-5b31170d932f-kube-api-access-ps7hc\") pod \"mariadb-operator-controller-manager-6994f66f48-6s7s5\" (UID: \"5b458e63-ce2e-4d37-9509-5b31170d932f\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.646532 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.652452 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.653264 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.661027 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.672189 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-n9hqs" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.678709 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.679243 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.689944 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.694034 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.699995 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.701308 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.703268 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qjpns" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.704325 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwsst\" (UniqueName: \"kubernetes.io/projected/ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6-kube-api-access-lwsst\") pod \"nova-operator-controller-manager-567668f5cf-kn6fp\" (UID: \"ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.704432 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwjm4\" (UniqueName: \"kubernetes.io/projected/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-kube-api-access-lwjm4\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.704470 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q4xr\" (UniqueName: \"kubernetes.io/projected/37d48839-36c8-4a2c-ac3d-a4e5394b11eb-kube-api-access-8q4xr\") pod \"octavia-operator-controller-manager-659dc6bbfc-8fthg\" (UID: \"37d48839-36c8-4a2c-ac3d-a4e5394b11eb\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 
11:09:27.704501 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgrk\" (UniqueName: \"kubernetes.io/projected/07870810-90ed-47a5-90f5-b684700f7092-kube-api-access-pwgrk\") pod \"ovn-operator-controller-manager-5955d8c787-lgqlc\" (UID: \"07870810-90ed-47a5-90f5-b684700f7092\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.704524 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.726190 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.736638 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwsst\" (UniqueName: \"kubernetes.io/projected/ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6-kube-api-access-lwsst\") pod \"nova-operator-controller-manager-567668f5cf-kn6fp\" (UID: \"ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.737481 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.742654 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.754220 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.755996 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.767307 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-h584z" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.767749 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.781284 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.782237 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.789224 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5lm4q" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.795484 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.809440 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2pf\" (UniqueName: \"kubernetes.io/projected/cf5974e9-29dc-4274-8f65-9cf82450bdfc-kube-api-access-wf2pf\") pod \"telemetry-operator-controller-manager-589c568786-t2ncn\" (UID: \"cf5974e9-29dc-4274-8f65-9cf82450bdfc\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.809517 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2n2f\" (UniqueName: \"kubernetes.io/projected/01823ef1-1bcc-49f8-8cbc-37db7edc9fd0-kube-api-access-z2n2f\") pod \"swift-operator-controller-manager-68f46476f-mvqqg\" (UID: \"01823ef1-1bcc-49f8-8cbc-37db7edc9fd0\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.809561 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwjm4\" (UniqueName: \"kubernetes.io/projected/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-kube-api-access-lwjm4\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 
11:09:27.809582 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q4xr\" (UniqueName: \"kubernetes.io/projected/37d48839-36c8-4a2c-ac3d-a4e5394b11eb-kube-api-access-8q4xr\") pod \"octavia-operator-controller-manager-659dc6bbfc-8fthg\" (UID: \"37d48839-36c8-4a2c-ac3d-a4e5394b11eb\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.809605 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgrk\" (UniqueName: \"kubernetes.io/projected/07870810-90ed-47a5-90f5-b684700f7092-kube-api-access-pwgrk\") pod \"ovn-operator-controller-manager-5955d8c787-lgqlc\" (UID: \"07870810-90ed-47a5-90f5-b684700f7092\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.809622 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.809660 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj7xq\" (UniqueName: \"kubernetes.io/projected/4b18c8c4-1868-4383-b2d7-d9b3c9a33e03-kube-api-access-dj7xq\") pod \"placement-operator-controller-manager-8497b45c89-v8c26\" (UID: \"4b18c8c4-1868-4383-b2d7-d9b3c9a33e03\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" Feb 25 11:09:27 crc kubenswrapper[4725]: E0225 11:09:27.810235 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Feb 25 11:09:27 crc kubenswrapper[4725]: E0225 11:09:27.810277 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert podName:2fbb069d-66ce-4d87-9fcb-f82181bd85e9 nodeName:}" failed. No retries permitted until 2026-02-25 11:09:28.310264732 +0000 UTC m=+993.808846757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" (UID: "2fbb069d-66ce-4d87-9fcb-f82181bd85e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.825996 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.837813 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q4xr\" (UniqueName: \"kubernetes.io/projected/37d48839-36c8-4a2c-ac3d-a4e5394b11eb-kube-api-access-8q4xr\") pod \"octavia-operator-controller-manager-659dc6bbfc-8fthg\" (UID: \"37d48839-36c8-4a2c-ac3d-a4e5394b11eb\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.840502 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwjm4\" (UniqueName: \"kubernetes.io/projected/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-kube-api-access-lwjm4\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.840565 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.841519 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.854953 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.855126 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.855142 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgrk\" (UniqueName: \"kubernetes.io/projected/07870810-90ed-47a5-90f5-b684700f7092-kube-api-access-pwgrk\") pod \"ovn-operator-controller-manager-5955d8c787-lgqlc\" (UID: \"07870810-90ed-47a5-90f5-b684700f7092\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.855603 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-66v9f" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.865397 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.865639 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.886602 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.887451 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.890153 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-687lx" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.890495 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.911202 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2pf\" (UniqueName: \"kubernetes.io/projected/cf5974e9-29dc-4274-8f65-9cf82450bdfc-kube-api-access-wf2pf\") pod \"telemetry-operator-controller-manager-589c568786-t2ncn\" (UID: \"cf5974e9-29dc-4274-8f65-9cf82450bdfc\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.911286 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2n2f\" (UniqueName: \"kubernetes.io/projected/01823ef1-1bcc-49f8-8cbc-37db7edc9fd0-kube-api-access-z2n2f\") pod \"swift-operator-controller-manager-68f46476f-mvqqg\" (UID: \"01823ef1-1bcc-49f8-8cbc-37db7edc9fd0\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.911319 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.911383 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhzwc\" (UniqueName: \"kubernetes.io/projected/e1b06e72-2952-4eee-9732-af05abc6a117-kube-api-access-fhzwc\") pod \"watcher-operator-controller-manager-bccc79885-6lfbp\" (UID: \"e1b06e72-2952-4eee-9732-af05abc6a117\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.911411 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdjq\" (UniqueName: \"kubernetes.io/projected/2b257035-93ff-456f-8aaa-e370a1756b0e-kube-api-access-vtdjq\") pod \"test-operator-controller-manager-5dc6794d5b-8gchs\" (UID: \"2b257035-93ff-456f-8aaa-e370a1756b0e\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.911432 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj7xq\" (UniqueName: \"kubernetes.io/projected/4b18c8c4-1868-4383-b2d7-d9b3c9a33e03-kube-api-access-dj7xq\") pod \"placement-operator-controller-manager-8497b45c89-v8c26\" (UID: \"4b18c8c4-1868-4383-b2d7-d9b3c9a33e03\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" Feb 25 11:09:27 crc kubenswrapper[4725]: E0225 11:09:27.911905 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 11:09:27 crc kubenswrapper[4725]: E0225 11:09:27.911942 4725 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert podName:b82c26d2-a08f-4c57-a876-9ac8a87c1fcf nodeName:}" failed. No retries permitted until 2026-02-25 11:09:28.911929702 +0000 UTC m=+994.410511727 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert") pod "infra-operator-controller-manager-79d975b745-6872z" (UID: "b82c26d2-a08f-4c57-a876-9ac8a87c1fcf") : secret "infra-operator-webhook-server-cert" not found Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.913886 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.934811 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2n2f\" (UniqueName: \"kubernetes.io/projected/01823ef1-1bcc-49f8-8cbc-37db7edc9fd0-kube-api-access-z2n2f\") pod \"swift-operator-controller-manager-68f46476f-mvqqg\" (UID: \"01823ef1-1bcc-49f8-8cbc-37db7edc9fd0\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.938711 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv"] Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.942863 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj7xq\" (UniqueName: \"kubernetes.io/projected/4b18c8c4-1868-4383-b2d7-d9b3c9a33e03-kube-api-access-dj7xq\") pod \"placement-operator-controller-manager-8497b45c89-v8c26\" (UID: \"4b18c8c4-1868-4383-b2d7-d9b3c9a33e03\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.942969 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wf2pf\" (UniqueName: \"kubernetes.io/projected/cf5974e9-29dc-4274-8f65-9cf82450bdfc-kube-api-access-wf2pf\") pod \"telemetry-operator-controller-manager-589c568786-t2ncn\" (UID: \"cf5974e9-29dc-4274-8f65-9cf82450bdfc\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.966165 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" Feb 25 11:09:27 crc kubenswrapper[4725]: I0225 11:09:27.976875 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.015953 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mng7b\" (UniqueName: \"kubernetes.io/projected/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-kube-api-access-mng7b\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.016005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdjq\" (UniqueName: \"kubernetes.io/projected/2b257035-93ff-456f-8aaa-e370a1756b0e-kube-api-access-vtdjq\") pod \"test-operator-controller-manager-5dc6794d5b-8gchs\" (UID: \"2b257035-93ff-456f-8aaa-e370a1756b0e\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.016029 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod 
\"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.016073 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.016118 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrgp9\" (UniqueName: \"kubernetes.io/projected/9921b017-bf1b-457d-b9ec-b344b0fabd1c-kube-api-access-hrgp9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bzx24\" (UID: \"9921b017-bf1b-457d-b9ec-b344b0fabd1c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.016191 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhzwc\" (UniqueName: \"kubernetes.io/projected/e1b06e72-2952-4eee-9732-af05abc6a117-kube-api-access-fhzwc\") pod \"watcher-operator-controller-manager-bccc79885-6lfbp\" (UID: \"e1b06e72-2952-4eee-9732-af05abc6a117\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.031085 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.033994 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdjq\" (UniqueName: \"kubernetes.io/projected/2b257035-93ff-456f-8aaa-e370a1756b0e-kube-api-access-vtdjq\") pod \"test-operator-controller-manager-5dc6794d5b-8gchs\" (UID: \"2b257035-93ff-456f-8aaa-e370a1756b0e\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.037245 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhzwc\" (UniqueName: \"kubernetes.io/projected/e1b06e72-2952-4eee-9732-af05abc6a117-kube-api-access-fhzwc\") pod \"watcher-operator-controller-manager-bccc79885-6lfbp\" (UID: \"e1b06e72-2952-4eee-9732-af05abc6a117\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.056745 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.112451 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.122882 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrgp9\" (UniqueName: \"kubernetes.io/projected/9921b017-bf1b-457d-b9ec-b344b0fabd1c-kube-api-access-hrgp9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bzx24\" (UID: \"9921b017-bf1b-457d-b9ec-b344b0fabd1c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.122979 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mng7b\" (UniqueName: \"kubernetes.io/projected/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-kube-api-access-mng7b\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.123008 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.123043 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.123149 4725 secret.go:188] 
Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.123199 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:28.623182093 +0000 UTC m=+994.121764118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "metrics-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.123860 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.123887 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:28.623878842 +0000 UTC m=+994.122460867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "webhook-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.137726 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.150644 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mng7b\" (UniqueName: \"kubernetes.io/projected/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-kube-api-access-mng7b\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.156794 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrgp9\" (UniqueName: \"kubernetes.io/projected/9921b017-bf1b-457d-b9ec-b344b0fabd1c-kube-api-access-hrgp9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bzx24\" (UID: \"9921b017-bf1b-457d-b9ec-b344b0fabd1c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.230114 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.253669 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-l278b"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.270453 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.278641 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.284422 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22854bfa_3684_4750_b2f7_e5ccbe3e92fb.slice/crio-bba114103970985a37c9d1189c74bf9dad21af2f13b2b250294d3736c77e3d7e WatchSource:0}: Error finding container bba114103970985a37c9d1189c74bf9dad21af2f13b2b250294d3736c77e3d7e: Status 404 returned error can't find the container with id bba114103970985a37c9d1189c74bf9dad21af2f13b2b250294d3736c77e3d7e Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.297415 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41775582_fd78_4c34_93fc_60b9cdc55a2c.slice/crio-879c71cf29c2e477e3fba133d2f4617a71b6c252a3e20494c9552dbde954b6a0 WatchSource:0}: Error finding container 879c71cf29c2e477e3fba133d2f4617a71b6c252a3e20494c9552dbde954b6a0: Status 404 returned error can't find the container with id 879c71cf29c2e477e3fba133d2f4617a71b6c252a3e20494c9552dbde954b6a0 Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.327957 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.328102 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.328153 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert podName:2fbb069d-66ce-4d87-9fcb-f82181bd85e9 nodeName:}" failed. No retries permitted until 2026-02-25 11:09:29.328138987 +0000 UTC m=+994.826721012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" (UID: "2fbb069d-66ce-4d87-9fcb-f82181bd85e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.379742 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" event={"ID":"22854bfa-3684-4750-b2f7-e5ccbe3e92fb","Type":"ContainerStarted","Data":"bba114103970985a37c9d1189c74bf9dad21af2f13b2b250294d3736c77e3d7e"} Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.408108 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" event={"ID":"27540507-aac9-4fd2-84a9-34a2a20885d7","Type":"ContainerStarted","Data":"1dbcc5ee520828ef6529cf0e4ead9615035182345d9fb159791cf225dd7d66d4"} Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.425791 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" event={"ID":"a897851d-6b6d-40e1-82f2-ef4db97b19d9","Type":"ContainerStarted","Data":"7f885f7cf48e389abfe90d54f3fb398e9bf7fbe68404f6927e4e8221194bf614"} Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.433083 4725 generic.go:334] "Generic (PLEG): container finished" podID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerID="01d8c6acd95d92f3a57b8d3bbeca900756919e025be6adfdd5c87c449dc73f6e" exitCode=0 Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.432377 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.434449 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhxt8" event={"ID":"c0cca41d-a3cc-4060-becc-ba00a60dd9bc","Type":"ContainerDied","Data":"01d8c6acd95d92f3a57b8d3bbeca900756919e025be6adfdd5c87c449dc73f6e"} Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.436103 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" event={"ID":"41775582-fd78-4c34-93fc-60b9cdc55a2c","Type":"ContainerStarted","Data":"879c71cf29c2e477e3fba133d2f4617a71b6c252a3e20494c9552dbde954b6a0"} Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.442084 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" event={"ID":"0755d178-0ceb-41f1-a26c-e96e466f8300","Type":"ContainerStarted","Data":"9bbe29b5615e053bb44dcbdf7f2043a39cebe3622388502e6d30c10292d044b4"} Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.471777 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.476525 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.481254 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-25sql"] Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.490821 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a7b2bf7_fab5_4634_9dfa_147dc2de21bc.slice/crio-aef623ade50fa6ad338f3280a7c26c80d188144ee760459cdb4557a2768219f1 
WatchSource:0}: Error finding container aef623ade50fa6ad338f3280a7c26c80d188144ee760459cdb4557a2768219f1: Status 404 returned error can't find the container with id aef623ade50fa6ad338f3280a7c26c80d188144ee760459cdb4557a2768219f1 Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.491480 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015fdc09_2359_48f1_9800_9d44efc254fc.slice/crio-390eb6b79f01d02219ce7757522418e1458d7ac23f04cd302cd2c8679625e50a WatchSource:0}: Error finding container 390eb6b79f01d02219ce7757522418e1458d7ac23f04cd302cd2c8679625e50a: Status 404 returned error can't find the container with id 390eb6b79f01d02219ce7757522418e1458d7ac23f04cd302cd2c8679625e50a Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.494044 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0279e1a1_c275_48e8_815c_0afae718b93a.slice/crio-fc942324409c0ec0c720b511a4bbfc8a1535ae41bc1b65ea7f0958e63164769a WatchSource:0}: Error finding container fc942324409c0ec0c720b511a4bbfc8a1535ae41bc1b65ea7f0958e63164769a: Status 404 returned error can't find the container with id fc942324409c0ec0c720b511a4bbfc8a1535ae41bc1b65ea7f0958e63164769a Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.605254 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.632134 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 
11:09:28.632190 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.632379 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.632448 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:29.632433158 +0000 UTC m=+995.131015183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "metrics-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.632491 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.632509 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:29.63250309 +0000 UTC m=+995.131085115 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "webhook-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.726330 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.746024 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.777369 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26"] Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.793601 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b18c8c4_1868_4383_b2d7_d9b3c9a33e03.slice/crio-5fc3d2e7657f42cf697d2487444ad446feee348afe5d98cfd87992ec5c8c785d WatchSource:0}: Error finding container 5fc3d2e7657f42cf697d2487444ad446feee348afe5d98cfd87992ec5c8c785d: Status 404 returned error can't find the container with id 5fc3d2e7657f42cf697d2487444ad446feee348afe5d98cfd87992ec5c8c785d Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.798392 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dj7xq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-v8c26_openstack-operators(4b18c8c4-1868-4383-b2d7-d9b3c9a33e03): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.799557 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" podUID="4b18c8c4-1868-4383-b2d7-d9b3c9a33e03" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.879777 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc"] Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.896710 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pwgrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5955d8c787-lgqlc_openstack-operators(07870810-90ed-47a5-90f5-b684700f7092): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.898201 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" podUID="07870810-90ed-47a5-90f5-b684700f7092" Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.904011 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5974e9_29dc_4274_8f65_9cf82450bdfc.slice/crio-f51f4aecaa4d44772c76350a38e0eb1437322211d265b4b0d3ee93745b3ad4e6 WatchSource:0}: Error finding container f51f4aecaa4d44772c76350a38e0eb1437322211d265b4b0d3ee93745b3ad4e6: Status 404 returned error can't find the container with id f51f4aecaa4d44772c76350a38e0eb1437322211d265b4b0d3ee93745b3ad4e6 Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.911024 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn"] Feb 25 11:09:28 crc kubenswrapper[4725]: W0225 11:09:28.912749 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba6741a0_f2ce_464b_aaa4_eafa6f4f0eb6.slice/crio-30ae3b26a60521133145c12a1806a0ecff75d1b1977b2e16fa6bf890587d93f9 WatchSource:0}: Error finding container 30ae3b26a60521133145c12a1806a0ecff75d1b1977b2e16fa6bf890587d93f9: Status 404 returned error can't find the container with id 30ae3b26a60521133145c12a1806a0ecff75d1b1977b2e16fa6bf890587d93f9 Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.912960 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wf2pf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-589c568786-t2ncn_openstack-operators(cf5974e9-29dc-4274-8f65-9cf82450bdfc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.914876 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" podUID="cf5974e9-29dc-4274-8f65-9cf82450bdfc" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.918497 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg"] Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.919628 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lwsst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-kn6fp_openstack-operators(ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.920850 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" podUID="ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6" Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.926652 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp"] Feb 25 11:09:28 crc kubenswrapper[4725]: I0225 11:09:28.948680 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.948809 4725 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 11:09:28 crc kubenswrapper[4725]: E0225 11:09:28.948872 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert podName:b82c26d2-a08f-4c57-a876-9ac8a87c1fcf nodeName:}" failed. No retries permitted until 2026-02-25 11:09:30.948859443 +0000 UTC m=+996.447441468 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert") pod "infra-operator-controller-manager-79d975b745-6872z" (UID: "b82c26d2-a08f-4c57-a876-9ac8a87c1fcf") : secret "infra-operator-webhook-server-cert" not found Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.045181 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp"] Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.051156 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs"] Feb 25 11:09:29 crc kubenswrapper[4725]: W0225 11:09:29.069050 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b257035_93ff_456f_8aaa_e370a1756b0e.slice/crio-5605eb37ad9e642d2e5bf10c6e2927a910c8d67a59e20d52b21d516f90892acc WatchSource:0}: Error finding container 5605eb37ad9e642d2e5bf10c6e2927a910c8d67a59e20d52b21d516f90892acc: Status 404 returned error can't find the container with id 5605eb37ad9e642d2e5bf10c6e2927a910c8d67a59e20d52b21d516f90892acc Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.071626 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vtdjq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-8gchs_openstack-operators(2b257035-93ff-456f-8aaa-e370a1756b0e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.075185 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" podUID="2b257035-93ff-456f-8aaa-e370a1756b0e"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.084694 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24"]
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.085389 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrgp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-bzx24_openstack-operators(9921b017-bf1b-457d-b9ec-b344b0fabd1c): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.086516 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" podUID="9921b017-bf1b-457d-b9ec-b344b0fabd1c"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.363742 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.363922 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.363981 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert podName:2fbb069d-66ce-4d87-9fcb-f82181bd85e9 nodeName:}" failed. No retries permitted until 2026-02-25 11:09:31.363953627 +0000 UTC m=+996.862535652 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" (UID: "2fbb069d-66ce-4d87-9fcb-f82181bd85e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.557526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" event={"ID":"9a7b2bf7-fab5-4634-9dfa-147dc2de21bc","Type":"ContainerStarted","Data":"aef623ade50fa6ad338f3280a7c26c80d188144ee760459cdb4557a2768219f1"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.584359 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" event={"ID":"9921b017-bf1b-457d-b9ec-b344b0fabd1c","Type":"ContainerStarted","Data":"41986b9754465cbd511d6341306a1d56c1511c0a088749281974b7ab64a36cd7"}
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.605390 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" podUID="9921b017-bf1b-457d-b9ec-b344b0fabd1c"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.607117 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" event={"ID":"cf5974e9-29dc-4274-8f65-9cf82450bdfc","Type":"ContainerStarted","Data":"f51f4aecaa4d44772c76350a38e0eb1437322211d265b4b0d3ee93745b3ad4e6"}
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.609322 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" podUID="cf5974e9-29dc-4274-8f65-9cf82450bdfc"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.612566 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" event={"ID":"015fdc09-2359-48f1-9800-9d44efc254fc","Type":"ContainerStarted","Data":"390eb6b79f01d02219ce7757522418e1458d7ac23f04cd302cd2c8679625e50a"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.614349 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" event={"ID":"4b18c8c4-1868-4383-b2d7-d9b3c9a33e03","Type":"ContainerStarted","Data":"5fc3d2e7657f42cf697d2487444ad446feee348afe5d98cfd87992ec5c8c785d"}
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.615550 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" podUID="4b18c8c4-1868-4383-b2d7-d9b3c9a33e03"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.618039 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhxt8" event={"ID":"c0cca41d-a3cc-4060-becc-ba00a60dd9bc","Type":"ContainerStarted","Data":"8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.649448 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" event={"ID":"6cf86133-a9ef-4a8b-a957-ef8e588b200e","Type":"ContainerStarted","Data":"8b5357b7fe88cac00e86cd8bae40e199e15ffe0b1b7b595531ec3c53bec12be5"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.650681 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" event={"ID":"37d48839-36c8-4a2c-ac3d-a4e5394b11eb","Type":"ContainerStarted","Data":"0334b6eaf06f9962fa63eaf3ab41cdc517ae2b57e12994e2587ceea24bf9be4c"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.651743 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" event={"ID":"0279e1a1-c275-48e8-815c-0afae718b93a","Type":"ContainerStarted","Data":"fc942324409c0ec0c720b511a4bbfc8a1535ae41bc1b65ea7f0958e63164769a"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.653605 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" event={"ID":"2b257035-93ff-456f-8aaa-e370a1756b0e","Type":"ContainerStarted","Data":"5605eb37ad9e642d2e5bf10c6e2927a910c8d67a59e20d52b21d516f90892acc"}
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.666218 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" podUID="2b257035-93ff-456f-8aaa-e370a1756b0e"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.666597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" event={"ID":"01823ef1-1bcc-49f8-8cbc-37db7edc9fd0","Type":"ContainerStarted","Data":"ff50b9da7519cc34e7728d58acba614f8ae935d798d124b60a980b3da2529bcc"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.670985 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" event={"ID":"07870810-90ed-47a5-90f5-b684700f7092","Type":"ContainerStarted","Data":"594d87fc807dea8b98c41b019c2e98e8da9378af9700a242943039f3ae233abc"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.672170 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.672215 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.672314 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.672347 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:31.672334958 +0000 UTC m=+997.170916983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "metrics-server-cert" not found
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.672426 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.672483 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:31.672475172 +0000 UTC m=+997.171057197 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "webhook-server-cert" not found
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.679671 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" podUID="07870810-90ed-47a5-90f5-b684700f7092"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.684358 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" event={"ID":"c07a7a9d-d976-4d10-af1d-b92b5da76d71","Type":"ContainerStarted","Data":"5f9bfc3ba618ba3795bfffef6180b590d39eb5de355a9094ff8b120c3826665a"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.687984 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" event={"ID":"ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6","Type":"ContainerStarted","Data":"30ae3b26a60521133145c12a1806a0ecff75d1b1977b2e16fa6bf890587d93f9"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.689611 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" event={"ID":"5b458e63-ce2e-4d37-9509-5b31170d932f","Type":"ContainerStarted","Data":"81135c15b91511f0f5d81197148e66205f1429742f5c8e42b89d3d87e97cc163"}
Feb 25 11:09:29 crc kubenswrapper[4725]: E0225 11:09:29.690802 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" podUID="ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6"
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.690891 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" event={"ID":"e1b06e72-2952-4eee-9732-af05abc6a117","Type":"ContainerStarted","Data":"970f6ba5f3d4b72adfd498591bc5236ff41e8d4bf21feee44a7e295ba5750d3a"}
Feb 25 11:09:29 crc kubenswrapper[4725]: I0225 11:09:29.757905 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zhxt8" podStartSLOduration=3.218319151 podStartE2EDuration="5.757888259s" podCreationTimestamp="2026-02-25 11:09:24 +0000 UTC" firstStartedPulling="2026-02-25 11:09:26.346927333 +0000 UTC m=+991.845509348" lastFinishedPulling="2026-02-25 11:09:28.886496441 +0000 UTC m=+994.385078456" observedRunningTime="2026-02-25 11:09:29.755203847 +0000 UTC m=+995.253785872" watchObservedRunningTime="2026-02-25 11:09:29.757888259 +0000 UTC m=+995.256470294"
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.714275 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" podUID="ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6"
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.715115 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" podUID="2b257035-93ff-456f-8aaa-e370a1756b0e"
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.715176 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" podUID="4b18c8c4-1868-4383-b2d7-d9b3c9a33e03"
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.715182 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" podUID="cf5974e9-29dc-4274-8f65-9cf82450bdfc"
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.715214 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" podUID="07870810-90ed-47a5-90f5-b684700f7092"
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.715245 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" podUID="9921b017-bf1b-457d-b9ec-b344b0fabd1c"
Feb 25 11:09:30 crc kubenswrapper[4725]: I0225 11:09:30.996657 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z"
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.996856 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 25 11:09:30 crc kubenswrapper[4725]: E0225 11:09:30.996932 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert podName:b82c26d2-a08f-4c57-a876-9ac8a87c1fcf nodeName:}" failed. No retries permitted until 2026-02-25 11:09:34.996913187 +0000 UTC m=+1000.495495262 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert") pod "infra-operator-controller-manager-79d975b745-6872z" (UID: "b82c26d2-a08f-4c57-a876-9ac8a87c1fcf") : secret "infra-operator-webhook-server-cert" not found
Feb 25 11:09:31 crc kubenswrapper[4725]: I0225 11:09:31.403409 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"
Feb 25 11:09:31 crc kubenswrapper[4725]: E0225 11:09:31.405424 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 25 11:09:31 crc kubenswrapper[4725]: E0225 11:09:31.405468 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert podName:2fbb069d-66ce-4d87-9fcb-f82181bd85e9 nodeName:}" failed. No retries permitted until 2026-02-25 11:09:35.405453448 +0000 UTC m=+1000.904035473 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" (UID: "2fbb069d-66ce-4d87-9fcb-f82181bd85e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 25 11:09:31 crc kubenswrapper[4725]: I0225 11:09:31.711037 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"
Feb 25 11:09:31 crc kubenswrapper[4725]: I0225 11:09:31.711094 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"
Feb 25 11:09:31 crc kubenswrapper[4725]: E0225 11:09:31.711200 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 25 11:09:31 crc kubenswrapper[4725]: E0225 11:09:31.711245 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:35.711231479 +0000 UTC m=+1001.209813504 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "metrics-server-cert" not found
Feb 25 11:09:31 crc kubenswrapper[4725]: E0225 11:09:31.711701 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 25 11:09:31 crc kubenswrapper[4725]: E0225 11:09:31.711733 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:35.711723352 +0000 UTC m=+1001.210305377 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "webhook-server-cert" not found
Feb 25 11:09:34 crc kubenswrapper[4725]: I0225 11:09:34.929892 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zhxt8"
Feb 25 11:09:34 crc kubenswrapper[4725]: I0225 11:09:34.930448 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zhxt8"
Feb 25 11:09:34 crc kubenswrapper[4725]: I0225 11:09:34.972855 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zhxt8"
Feb 25 11:09:35 crc kubenswrapper[4725]: I0225 11:09:35.060139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z"
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.061131 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.061182 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert podName:b82c26d2-a08f-4c57-a876-9ac8a87c1fcf nodeName:}" failed. No retries permitted until 2026-02-25 11:09:43.061168528 +0000 UTC m=+1008.559750543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert") pod "infra-operator-controller-manager-79d975b745-6872z" (UID: "b82c26d2-a08f-4c57-a876-9ac8a87c1fcf") : secret "infra-operator-webhook-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: I0225 11:09:35.465715 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.465807 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.465907 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert podName:2fbb069d-66ce-4d87-9fcb-f82181bd85e9 nodeName:}" failed. No retries permitted until 2026-02-25 11:09:43.465892937 +0000 UTC m=+1008.964474962 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" (UID: "2fbb069d-66ce-4d87-9fcb-f82181bd85e9") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: I0225 11:09:35.770718 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"
Feb 25 11:09:35 crc kubenswrapper[4725]: I0225 11:09:35.770865 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.770933 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.771012 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:43.7709905 +0000 UTC m=+1009.269572525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "webhook-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.771066 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: E0225 11:09:35.771151 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs podName:b6b802f9-7adb-43ca-b8ae-de7bacb908fb nodeName:}" failed. No retries permitted until 2026-02-25 11:09:43.771127993 +0000 UTC m=+1009.269710048 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs") pod "openstack-operator-controller-manager-7489bcf59c-kb5pq" (UID: "b6b802f9-7adb-43ca-b8ae-de7bacb908fb") : secret "metrics-server-cert" not found
Feb 25 11:09:35 crc kubenswrapper[4725]: I0225 11:09:35.824358 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zhxt8"
Feb 25 11:09:35 crc kubenswrapper[4725]: I0225 11:09:35.863028 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhxt8"]
Feb 25 11:09:37 crc kubenswrapper[4725]: I0225 11:09:37.760797 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zhxt8" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="registry-server" containerID="cri-o://8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237" gracePeriod=2
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.305465 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66gx8"]
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.308667 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.324930 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66gx8"]
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.424993 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2nzw\" (UniqueName: \"kubernetes.io/projected/5c09b8a8-b815-45fb-9ef1-8e78844135cc-kube-api-access-j2nzw\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.425044 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-utilities\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.425083 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-catalog-content\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.526570 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-catalog-content\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.526756 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2nzw\" (UniqueName: \"kubernetes.io/projected/5c09b8a8-b815-45fb-9ef1-8e78844135cc-kube-api-access-j2nzw\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.526788 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-utilities\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.527182 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-catalog-content\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.527272 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-utilities\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.554236 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2nzw\" (UniqueName: \"kubernetes.io/projected/5c09b8a8-b815-45fb-9ef1-8e78844135cc-kube-api-access-j2nzw\") pod \"redhat-marketplace-66gx8\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.630480 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66gx8"
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.778103 4725 generic.go:334] "Generic (PLEG): container finished" podID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerID="8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237" exitCode=0
Feb 25 11:09:39 crc kubenswrapper[4725]: I0225 11:09:39.778141 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhxt8" event={"ID":"c0cca41d-a3cc-4060-becc-ba00a60dd9bc","Type":"ContainerDied","Data":"8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237"}
Feb 25 11:09:41 crc kubenswrapper[4725]: I0225 11:09:41.555687 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:09:41 crc kubenswrapper[4725]: I0225 11:09:41.556250 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.109866 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z"
Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.120704 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82c26d2-a08f-4c57-a876-9ac8a87c1fcf-cert\") pod \"infra-operator-controller-manager-79d975b745-6872z\" (UID: \"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z"
Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.407761 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z"
Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.516560 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"
Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.520885 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbb069d-66ce-4d87-9fcb-f82181bd85e9-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd\" (UID: \"2fbb069d-66ce-4d87-9fcb-f82181bd85e9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"
Feb 25 11:09:43 crc kubenswrapper[4725]: E0225 11:09:43.689452 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc"
Feb 25 11:09:43 crc kubenswrapper[4725]: E0225 11:09:43.689967 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n8vhn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-868647ff47-l278b_openstack-operators(41775582-fd78-4c34-93fc-60b9cdc55a2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:43 crc kubenswrapper[4725]: E0225 11:09:43.691137 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" podUID="41775582-fd78-4c34-93fc-60b9cdc55a2c" Feb 25 11:09:43 crc kubenswrapper[4725]: E0225 11:09:43.807631 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" podUID="41775582-fd78-4c34-93fc-60b9cdc55a2c" Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.807967 4725 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.821260 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.821319 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.825048 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-webhook-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:43 crc kubenswrapper[4725]: I0225 11:09:43.825660 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6b802f9-7adb-43ca-b8ae-de7bacb908fb-metrics-certs\") pod \"openstack-operator-controller-manager-7489bcf59c-kb5pq\" (UID: \"b6b802f9-7adb-43ca-b8ae-de7bacb908fb\") " pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:44 crc kubenswrapper[4725]: I0225 11:09:44.097953 4725 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.340513 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.340730 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhzwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-6lfbp_openstack-operators(e1b06e72-2952-4eee-9732-af05abc6a117): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.341949 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" podUID="e1b06e72-2952-4eee-9732-af05abc6a117" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.812612 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" podUID="e1b06e72-2952-4eee-9732-af05abc6a117" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.893984 4725 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.894150 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8q4xr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-8fthg_openstack-operators(37d48839-36c8-4a2c-ac3d-a4e5394b11eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.895359 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" podUID="37d48839-36c8-4a2c-ac3d-a4e5394b11eb" Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.930377 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237 is running failed: container process not found" containerID="8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.930842 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237 is running failed: container process not found" containerID="8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.931241 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237 is running failed: container process not found" containerID="8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 11:09:44 crc kubenswrapper[4725]: E0225 11:09:44.931318 4725 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-zhxt8" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="registry-server" Feb 25 11:09:45 crc kubenswrapper[4725]: E0225 11:09:45.369274 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04" Feb 25 11:09:45 crc kubenswrapper[4725]: E0225 11:09:45.369469 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2n2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-mvqqg_openstack-operators(01823ef1-1bcc-49f8-8cbc-37db7edc9fd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:45 crc kubenswrapper[4725]: E0225 11:09:45.370581 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" podUID="01823ef1-1bcc-49f8-8cbc-37db7edc9fd0" Feb 25 11:09:45 crc kubenswrapper[4725]: E0225 11:09:45.818045 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" podUID="37d48839-36c8-4a2c-ac3d-a4e5394b11eb" Feb 25 11:09:45 crc kubenswrapper[4725]: E0225 11:09:45.818164 4725 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" podUID="01823ef1-1bcc-49f8-8cbc-37db7edc9fd0" Feb 25 11:09:46 crc kubenswrapper[4725]: E0225 11:09:46.900956 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf" Feb 25 11:09:46 crc kubenswrapper[4725]: E0225 11:09:46.901124 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ts9sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-pxnr7_openstack-operators(c07a7a9d-d976-4d10-af1d-b92b5da76d71): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:46 crc kubenswrapper[4725]: E0225 11:09:46.902442 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" podUID="c07a7a9d-d976-4d10-af1d-b92b5da76d71" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.381493 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.381667 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kcl7z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-j4hbq_openstack-operators(6cf86133-a9ef-4a8b-a957-ef8e588b200e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.382746 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" podUID="6cf86133-a9ef-4a8b-a957-ef8e588b200e" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.837604 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" podUID="c07a7a9d-d976-4d10-af1d-b92b5da76d71" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.837813 4725 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" podUID="6cf86133-a9ef-4a8b-a957-ef8e588b200e" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.964906 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.965427 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-svrbl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-25sql_openstack-operators(0279e1a1-c275-48e8-815c-0afae718b93a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:47 crc kubenswrapper[4725]: E0225 11:09:47.966837 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" podUID="0279e1a1-c275-48e8-815c-0afae718b93a" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.005424 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.079206 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rzx\" (UniqueName: \"kubernetes.io/projected/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-kube-api-access-j6rzx\") pod \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.079268 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-catalog-content\") pod \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.079312 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-utilities\") pod \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\" (UID: \"c0cca41d-a3cc-4060-becc-ba00a60dd9bc\") " Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.080384 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-utilities" (OuterVolumeSpecName: "utilities") pod "c0cca41d-a3cc-4060-becc-ba00a60dd9bc" (UID: "c0cca41d-a3cc-4060-becc-ba00a60dd9bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.086291 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-kube-api-access-j6rzx" (OuterVolumeSpecName: "kube-api-access-j6rzx") pod "c0cca41d-a3cc-4060-becc-ba00a60dd9bc" (UID: "c0cca41d-a3cc-4060-becc-ba00a60dd9bc"). InnerVolumeSpecName "kube-api-access-j6rzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.130146 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0cca41d-a3cc-4060-becc-ba00a60dd9bc" (UID: "c0cca41d-a3cc-4060-becc-ba00a60dd9bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.180978 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.181019 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rzx\" (UniqueName: \"kubernetes.io/projected/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-kube-api-access-j6rzx\") on node \"crc\" DevicePath \"\"" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.181033 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0cca41d-a3cc-4060-becc-ba00a60dd9bc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.846158 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhxt8" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.847001 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhxt8" event={"ID":"c0cca41d-a3cc-4060-becc-ba00a60dd9bc","Type":"ContainerDied","Data":"1f8035c9d3fc9422fe0d939a32c402b4e96670a482ba045605dba19dc4d4d20d"} Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.847060 4725 scope.go:117] "RemoveContainer" containerID="8a238eb7a896df89ddc287141bbb3e761be3b3e567a7192b10d12adb8e1ca237" Feb 25 11:09:48 crc kubenswrapper[4725]: E0225 11:09:48.847782 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" podUID="0279e1a1-c275-48e8-815c-0afae718b93a" Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.890140 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhxt8"] Feb 25 11:09:48 crc kubenswrapper[4725]: I0225 11:09:48.896986 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zhxt8"] Feb 25 11:09:49 crc kubenswrapper[4725]: I0225 11:09:49.251855 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" path="/var/lib/kubelet/pods/c0cca41d-a3cc-4060-becc-ba00a60dd9bc/volumes" Feb 25 11:09:49 crc kubenswrapper[4725]: E0225 11:09:49.656958 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 25 11:09:49 crc 
kubenswrapper[4725]: E0225 11:09:49.657617 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6hjbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-2vhq7_openstack-operators(9a7b2bf7-fab5-4634-9dfa-147dc2de21bc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:09:49 crc kubenswrapper[4725]: E0225 11:09:49.659188 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" podUID="9a7b2bf7-fab5-4634-9dfa-147dc2de21bc" Feb 25 11:09:49 crc kubenswrapper[4725]: E0225 11:09:49.853371 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" podUID="9a7b2bf7-fab5-4634-9dfa-147dc2de21bc" Feb 25 11:09:52 crc kubenswrapper[4725]: I0225 11:09:52.733940 4725 scope.go:117] "RemoveContainer" 
containerID="01d8c6acd95d92f3a57b8d3bbeca900756919e025be6adfdd5c87c449dc73f6e" Feb 25 11:09:52 crc kubenswrapper[4725]: I0225 11:09:52.999502 4725 scope.go:117] "RemoveContainer" containerID="1d4e50a85ac8215f908d97d23d7efdf33c1ac13931a7780aa2d8780b5305210a" Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.304339 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq"] Feb 25 11:09:53 crc kubenswrapper[4725]: W0225 11:09:53.305972 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6b802f9_7adb_43ca_b8ae_de7bacb908fb.slice/crio-1aeec6b949c98e8ba7498c4e57dac6b0d4b607161a3ef0166d7210137fc0db8f WatchSource:0}: Error finding container 1aeec6b949c98e8ba7498c4e57dac6b0d4b607161a3ef0166d7210137fc0db8f: Status 404 returned error can't find the container with id 1aeec6b949c98e8ba7498c4e57dac6b0d4b607161a3ef0166d7210137fc0db8f Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.405678 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-6872z"] Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.425487 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66gx8"] Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.435395 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd"] Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.930346 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" event={"ID":"07870810-90ed-47a5-90f5-b684700f7092","Type":"ContainerStarted","Data":"1696b17c52506e0972bf1a309cb0f6db1561c02436097dd59340dee6da98eecb"} Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.931407 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.946691 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" event={"ID":"ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6","Type":"ContainerStarted","Data":"e1d78a033f9d14dea3085efe65fcef34034b8e00608d7f661fe3cdf50063881c"} Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.946961 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.958574 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" podStartSLOduration=3.089092408 podStartE2EDuration="26.958558942s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.896560289 +0000 UTC m=+994.395142314" lastFinishedPulling="2026-02-25 11:09:52.766026803 +0000 UTC m=+1018.264608848" observedRunningTime="2026-02-25 11:09:53.95623668 +0000 UTC m=+1019.454818705" watchObservedRunningTime="2026-02-25 11:09:53.958558942 +0000 UTC m=+1019.457140967" Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.971078 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" event={"ID":"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf","Type":"ContainerStarted","Data":"ef7093264af2bd2d0e9c19580a72bdbff94e9d6dddb946fd58f928c872fc82f9"} Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.991580 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" 
event={"ID":"22854bfa-3684-4750-b2f7-e5ccbe3e92fb","Type":"ContainerStarted","Data":"d53f4ec743f5b028016a8c2bb386787ab39df3ae09ce9ea961140877256b7b4e"} Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.992404 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" Feb 25 11:09:53 crc kubenswrapper[4725]: I0225 11:09:53.995299 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" podStartSLOduration=3.036584399 podStartE2EDuration="26.995285811s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.919521981 +0000 UTC m=+994.418104006" lastFinishedPulling="2026-02-25 11:09:52.878223393 +0000 UTC m=+1018.376805418" observedRunningTime="2026-02-25 11:09:53.992193379 +0000 UTC m=+1019.490775414" watchObservedRunningTime="2026-02-25 11:09:53.995285811 +0000 UTC m=+1019.493867836" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.007192 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" event={"ID":"2b257035-93ff-456f-8aaa-e370a1756b0e","Type":"ContainerStarted","Data":"6f0fc71af1f54eccf4b07e8b0a21cccb1f0373ab255f38ff592d44c8f5bc45ac"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.007993 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.018247 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" event={"ID":"4b18c8c4-1868-4383-b2d7-d9b3c9a33e03","Type":"ContainerStarted","Data":"600a462b05c1e23416244d97e50cb636532a728df9b855ef65cf8c2bccf26e9e"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.018755 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.029577 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" podStartSLOduration=6.184801133 podStartE2EDuration="28.029564455s" podCreationTimestamp="2026-02-25 11:09:26 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.297618353 +0000 UTC m=+993.796200378" lastFinishedPulling="2026-02-25 11:09:50.142381645 +0000 UTC m=+1015.640963700" observedRunningTime="2026-02-25 11:09:54.027192622 +0000 UTC m=+1019.525774647" watchObservedRunningTime="2026-02-25 11:09:54.029564455 +0000 UTC m=+1019.528146480" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.036283 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" event={"ID":"cf5974e9-29dc-4274-8f65-9cf82450bdfc","Type":"ContainerStarted","Data":"dd58b5250c308e75a3c953452a88252a9713dcbf6f613dd7a0033d0024772b28"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.037022 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.038738 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerID="3bb3088341be3202cf583e56552012834cb365f99b53c50f59d702883180baed" exitCode=0 Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.038792 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66gx8" event={"ID":"5c09b8a8-b815-45fb-9ef1-8e78844135cc","Type":"ContainerDied","Data":"3bb3088341be3202cf583e56552012834cb365f99b53c50f59d702883180baed"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 
11:09:54.038811 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66gx8" event={"ID":"5c09b8a8-b815-45fb-9ef1-8e78844135cc","Type":"ContainerStarted","Data":"ea70d16f23c6a3b7b9bd50ba8ad8398013bf970c14eed60c04c7793efb94c443"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.042597 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" podStartSLOduration=3.349719735 podStartE2EDuration="27.042587772s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:29.071340017 +0000 UTC m=+994.569922042" lastFinishedPulling="2026-02-25 11:09:52.764208034 +0000 UTC m=+1018.262790079" observedRunningTime="2026-02-25 11:09:54.041905854 +0000 UTC m=+1019.540487879" watchObservedRunningTime="2026-02-25 11:09:54.042587772 +0000 UTC m=+1019.541169797" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.049534 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" event={"ID":"27540507-aac9-4fd2-84a9-34a2a20885d7","Type":"ContainerStarted","Data":"2e1eecf1625bb1ad94e0a49ac24415581c7487fd26bfedcb745444dc2eec796c"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.050014 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.061661 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" event={"ID":"b6b802f9-7adb-43ca-b8ae-de7bacb908fb","Type":"ContainerStarted","Data":"64c1f6350413665e5459d917b913a296c1b734c86bc22364c75e64153da4a3e0"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.061706 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" event={"ID":"b6b802f9-7adb-43ca-b8ae-de7bacb908fb","Type":"ContainerStarted","Data":"1aeec6b949c98e8ba7498c4e57dac6b0d4b607161a3ef0166d7210137fc0db8f"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.062995 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.076526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" event={"ID":"a897851d-6b6d-40e1-82f2-ef4db97b19d9","Type":"ContainerStarted","Data":"ca76c30d71b9766ef8c87b29ab60a8e1384edcd73de09e4faeda746eaea15643"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.076651 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.078101 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" event={"ID":"0755d178-0ceb-41f1-a26c-e96e466f8300","Type":"ContainerStarted","Data":"c6f883caa83eb2f4cd1ca457bd5c2cae857e60ff5cd4117c17c686375d1314b9"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.078217 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.091885 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" podStartSLOduration=2.914133615 podStartE2EDuration="27.091872156s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.798255639 +0000 UTC m=+994.296837664" 
lastFinishedPulling="2026-02-25 11:09:52.97599418 +0000 UTC m=+1018.474576205" observedRunningTime="2026-02-25 11:09:54.073108746 +0000 UTC m=+1019.571690771" watchObservedRunningTime="2026-02-25 11:09:54.091872156 +0000 UTC m=+1019.590454181" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.094590 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" podStartSLOduration=3.167286093 podStartE2EDuration="27.094581958s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.912804962 +0000 UTC m=+994.411386987" lastFinishedPulling="2026-02-25 11:09:52.840100827 +0000 UTC m=+1018.338682852" observedRunningTime="2026-02-25 11:09:54.089152013 +0000 UTC m=+1019.587734038" watchObservedRunningTime="2026-02-25 11:09:54.094581958 +0000 UTC m=+1019.593163983" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.100519 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" event={"ID":"5b458e63-ce2e-4d37-9509-5b31170d932f","Type":"ContainerStarted","Data":"53ad4b48c3ef1cefd712778a14963755d16f711426e46a97e33267f7e0345913"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.101164 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.124024 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" event={"ID":"015fdc09-2359-48f1-9800-9d44efc254fc","Type":"ContainerStarted","Data":"c0dcb90f082152239a4d6130d637c84c6f3f6230b0a179e842db52f77a6a4303"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.124611 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.132266 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" event={"ID":"2fbb069d-66ce-4d87-9fcb-f82181bd85e9","Type":"ContainerStarted","Data":"beb11c9f1f09285512e8128e5bd2d5d8f7bdef432175069e99fd644d7a401e57"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.137017 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" event={"ID":"9921b017-bf1b-457d-b9ec-b344b0fabd1c","Type":"ContainerStarted","Data":"63a86da49df3cea8d7441fc458a51b9597fbf6d85557729bf10959586b32768d"} Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.171034 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" podStartSLOduration=6.06690591 podStartE2EDuration="28.171019805s" podCreationTimestamp="2026-02-25 11:09:26 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.040302054 +0000 UTC m=+993.538884079" lastFinishedPulling="2026-02-25 11:09:50.144415919 +0000 UTC m=+1015.642997974" observedRunningTime="2026-02-25 11:09:54.169501254 +0000 UTC m=+1019.668083269" watchObservedRunningTime="2026-02-25 11:09:54.171019805 +0000 UTC m=+1019.669601830" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.176121 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" podStartSLOduration=27.17610648 podStartE2EDuration="27.17610648s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:09:54.150305842 +0000 UTC m=+1019.648887877" 
watchObservedRunningTime="2026-02-25 11:09:54.17610648 +0000 UTC m=+1019.674688505" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.225932 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bzx24" podStartSLOduration=3.345170243 podStartE2EDuration="27.225918038s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:29.085285899 +0000 UTC m=+994.583867924" lastFinishedPulling="2026-02-25 11:09:52.966033694 +0000 UTC m=+1018.464615719" observedRunningTime="2026-02-25 11:09:54.222428385 +0000 UTC m=+1019.721010410" watchObservedRunningTime="2026-02-25 11:09:54.225918038 +0000 UTC m=+1019.724500053" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.244148 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" podStartSLOduration=6.470337785 podStartE2EDuration="28.244131984s" podCreationTimestamp="2026-02-25 11:09:26 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.362607566 +0000 UTC m=+993.861189591" lastFinishedPulling="2026-02-25 11:09:50.136401725 +0000 UTC m=+1015.634983790" observedRunningTime="2026-02-25 11:09:54.237791985 +0000 UTC m=+1019.736374010" watchObservedRunningTime="2026-02-25 11:09:54.244131984 +0000 UTC m=+1019.742714009" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.261927 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" podStartSLOduration=6.250829762 podStartE2EDuration="28.261911607s" podCreationTimestamp="2026-02-25 11:09:26 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.129988565 +0000 UTC m=+993.628570590" lastFinishedPulling="2026-02-25 11:09:50.14107038 +0000 UTC m=+1015.639652435" observedRunningTime="2026-02-25 11:09:54.256325069 +0000 UTC m=+1019.754907094" 
watchObservedRunningTime="2026-02-25 11:09:54.261911607 +0000 UTC m=+1019.760493632" Feb 25 11:09:54 crc kubenswrapper[4725]: I0225 11:09:54.276923 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" podStartSLOduration=5.630669477 podStartE2EDuration="27.276911307s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.495022065 +0000 UTC m=+993.993604090" lastFinishedPulling="2026-02-25 11:09:50.141263865 +0000 UTC m=+1015.639845920" observedRunningTime="2026-02-25 11:09:54.27587335 +0000 UTC m=+1019.774455375" watchObservedRunningTime="2026-02-25 11:09:54.276911307 +0000 UTC m=+1019.775493332" Feb 25 11:09:55 crc kubenswrapper[4725]: I0225 11:09:55.145914 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerID="4ea1742664c14f3e7bbccd863ab8b2a2e15504d64fa7773effef7753b4fdefb5" exitCode=0 Feb 25 11:09:55 crc kubenswrapper[4725]: I0225 11:09:55.145972 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66gx8" event={"ID":"5c09b8a8-b815-45fb-9ef1-8e78844135cc","Type":"ContainerDied","Data":"4ea1742664c14f3e7bbccd863ab8b2a2e15504d64fa7773effef7753b4fdefb5"} Feb 25 11:09:55 crc kubenswrapper[4725]: I0225 11:09:55.168879 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" podStartSLOduration=6.489972444 podStartE2EDuration="28.168863114s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.462704184 +0000 UTC m=+993.961286209" lastFinishedPulling="2026-02-25 11:09:50.141594824 +0000 UTC m=+1015.640176879" observedRunningTime="2026-02-25 11:09:54.306918297 +0000 UTC m=+1019.805500322" watchObservedRunningTime="2026-02-25 11:09:55.168863114 +0000 UTC m=+1020.667445149" Feb 25 11:09:57 crc 
kubenswrapper[4725]: I0225 11:09:57.166816 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" event={"ID":"b82c26d2-a08f-4c57-a876-9ac8a87c1fcf","Type":"ContainerStarted","Data":"606cfbecee11f26c5a0aea7aaccf78533db418aeb30f24fc0343229bd5c13272"} Feb 25 11:09:57 crc kubenswrapper[4725]: I0225 11:09:57.167505 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:09:57 crc kubenswrapper[4725]: I0225 11:09:57.171157 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66gx8" event={"ID":"5c09b8a8-b815-45fb-9ef1-8e78844135cc","Type":"ContainerStarted","Data":"be1dcca38271cd0d8f93de0cc86396eaae0d22fd411d4ea0f3cff72857247ebf"} Feb 25 11:09:57 crc kubenswrapper[4725]: I0225 11:09:57.173384 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" event={"ID":"2fbb069d-66ce-4d87-9fcb-f82181bd85e9","Type":"ContainerStarted","Data":"cd8ff96f081bb7d2ebec381b31a185f290150b399f9a9457ebb515621a6170a3"} Feb 25 11:09:57 crc kubenswrapper[4725]: I0225 11:09:57.174256 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:09:57 crc kubenswrapper[4725]: I0225 11:09:57.197234 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" podStartSLOduration=27.255066256 podStartE2EDuration="30.197212594s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:53.421625449 +0000 UTC m=+1018.920207474" lastFinishedPulling="2026-02-25 11:09:56.363771787 +0000 UTC m=+1021.862353812" observedRunningTime="2026-02-25 11:09:57.190983938 +0000 UTC 
m=+1022.689565993" watchObservedRunningTime="2026-02-25 11:09:57.197212594 +0000 UTC m=+1022.695794639" Feb 25 11:09:57 crc kubenswrapper[4725]: I0225 11:09:57.238342 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" podStartSLOduration=27.329455059 podStartE2EDuration="30.2383203s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:53.451418123 +0000 UTC m=+1018.950000148" lastFinishedPulling="2026-02-25 11:09:56.360283354 +0000 UTC m=+1021.858865389" observedRunningTime="2026-02-25 11:09:57.236847771 +0000 UTC m=+1022.735429796" watchObservedRunningTime="2026-02-25 11:09:57.2383203 +0000 UTC m=+1022.736902345" Feb 25 11:09:57 crc kubenswrapper[4725]: I0225 11:09:57.262263 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66gx8" podStartSLOduration=15.945401999 podStartE2EDuration="18.262248248s" podCreationTimestamp="2026-02-25 11:09:39 +0000 UTC" firstStartedPulling="2026-02-25 11:09:54.042851589 +0000 UTC m=+1019.541433614" lastFinishedPulling="2026-02-25 11:09:56.359697828 +0000 UTC m=+1021.858279863" observedRunningTime="2026-02-25 11:09:57.25932403 +0000 UTC m=+1022.757906055" watchObservedRunningTime="2026-02-25 11:09:57.262248248 +0000 UTC m=+1022.760830273" Feb 25 11:09:58 crc kubenswrapper[4725]: I0225 11:09:58.059733 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-t2ncn" Feb 25 11:09:58 crc kubenswrapper[4725]: I0225 11:09:58.115945 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-8gchs" Feb 25 11:09:59 crc kubenswrapper[4725]: I0225 11:09:59.202387 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" event={"ID":"41775582-fd78-4c34-93fc-60b9cdc55a2c","Type":"ContainerStarted","Data":"524ef4b23325d0b98b44ad408b058f0565bde0c5c24fe8f913dba593f9b33467"} Feb 25 11:09:59 crc kubenswrapper[4725]: I0225 11:09:59.202960 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" Feb 25 11:09:59 crc kubenswrapper[4725]: I0225 11:09:59.228238 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" podStartSLOduration=2.8565187720000003 podStartE2EDuration="33.228215314s" podCreationTimestamp="2026-02-25 11:09:26 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.31101507 +0000 UTC m=+993.809597095" lastFinishedPulling="2026-02-25 11:09:58.682711612 +0000 UTC m=+1024.181293637" observedRunningTime="2026-02-25 11:09:59.225509402 +0000 UTC m=+1024.724091467" watchObservedRunningTime="2026-02-25 11:09:59.228215314 +0000 UTC m=+1024.726797359" Feb 25 11:09:59 crc kubenswrapper[4725]: I0225 11:09:59.631454 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66gx8" Feb 25 11:09:59 crc kubenswrapper[4725]: I0225 11:09:59.631513 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66gx8" Feb 25 11:09:59 crc kubenswrapper[4725]: I0225 11:09:59.698711 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66gx8" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.154945 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533630-v7bl7"] Feb 25 11:10:00 crc kubenswrapper[4725]: E0225 11:10:00.155500 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="extract-content" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.155544 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="extract-content" Feb 25 11:10:00 crc kubenswrapper[4725]: E0225 11:10:00.155577 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="extract-utilities" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.155596 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="extract-utilities" Feb 25 11:10:00 crc kubenswrapper[4725]: E0225 11:10:00.155641 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="registry-server" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.155660 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="registry-server" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.156070 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0cca41d-a3cc-4060-becc-ba00a60dd9bc" containerName="registry-server" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.157083 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533630-v7bl7" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.159508 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.159636 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.161432 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28krf\" (UniqueName: \"kubernetes.io/projected/22e6596a-9d15-422f-8436-5c3ea71de9a6-kube-api-access-28krf\") pod \"auto-csr-approver-29533630-v7bl7\" (UID: \"22e6596a-9d15-422f-8436-5c3ea71de9a6\") " pod="openshift-infra/auto-csr-approver-29533630-v7bl7" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.162492 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.168103 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533630-v7bl7"] Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.212199 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" event={"ID":"e1b06e72-2952-4eee-9732-af05abc6a117","Type":"ContainerStarted","Data":"687b33b1b32a57b96aad3b6ee7b2c77754989a3736005627749108db58ff6216"} Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.214041 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.231173 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" 
podStartSLOduration=2.6379863930000003 podStartE2EDuration="33.231153969s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:29.07105361 +0000 UTC m=+994.569635635" lastFinishedPulling="2026-02-25 11:09:59.664221146 +0000 UTC m=+1025.162803211" observedRunningTime="2026-02-25 11:10:00.225346704 +0000 UTC m=+1025.723928739" watchObservedRunningTime="2026-02-25 11:10:00.231153969 +0000 UTC m=+1025.729736014" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.262485 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28krf\" (UniqueName: \"kubernetes.io/projected/22e6596a-9d15-422f-8436-5c3ea71de9a6-kube-api-access-28krf\") pod \"auto-csr-approver-29533630-v7bl7\" (UID: \"22e6596a-9d15-422f-8436-5c3ea71de9a6\") " pod="openshift-infra/auto-csr-approver-29533630-v7bl7" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.291545 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28krf\" (UniqueName: \"kubernetes.io/projected/22e6596a-9d15-422f-8436-5c3ea71de9a6-kube-api-access-28krf\") pod \"auto-csr-approver-29533630-v7bl7\" (UID: \"22e6596a-9d15-422f-8436-5c3ea71de9a6\") " pod="openshift-infra/auto-csr-approver-29533630-v7bl7" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.480226 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533630-v7bl7" Feb 25 11:10:00 crc kubenswrapper[4725]: I0225 11:10:00.966928 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533630-v7bl7"] Feb 25 11:10:00 crc kubenswrapper[4725]: W0225 11:10:00.975202 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22e6596a_9d15_422f_8436_5c3ea71de9a6.slice/crio-beb68d5ed0aecd5d0a85cb32f2ff5ebf441a58e11b70399987ec92de6881c51c WatchSource:0}: Error finding container beb68d5ed0aecd5d0a85cb32f2ff5ebf441a58e11b70399987ec92de6881c51c: Status 404 returned error can't find the container with id beb68d5ed0aecd5d0a85cb32f2ff5ebf441a58e11b70399987ec92de6881c51c Feb 25 11:10:01 crc kubenswrapper[4725]: I0225 11:10:01.221554 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" event={"ID":"37d48839-36c8-4a2c-ac3d-a4e5394b11eb","Type":"ContainerStarted","Data":"6535381feeb68d485091edc3697ad76a44c4d73af7e542f282af9bb2e71fdffc"} Feb 25 11:10:01 crc kubenswrapper[4725]: I0225 11:10:01.221779 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" Feb 25 11:10:01 crc kubenswrapper[4725]: I0225 11:10:01.233601 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533630-v7bl7" event={"ID":"22e6596a-9d15-422f-8436-5c3ea71de9a6","Type":"ContainerStarted","Data":"beb68d5ed0aecd5d0a85cb32f2ff5ebf441a58e11b70399987ec92de6881c51c"} Feb 25 11:10:01 crc kubenswrapper[4725]: I0225 11:10:01.233653 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" 
event={"ID":"01823ef1-1bcc-49f8-8cbc-37db7edc9fd0","Type":"ContainerStarted","Data":"587b068ce485ee044dfac3e3cf142578ec61339ec56aa371d6c156723bd13828"} Feb 25 11:10:01 crc kubenswrapper[4725]: I0225 11:10:01.233860 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" Feb 25 11:10:01 crc kubenswrapper[4725]: I0225 11:10:01.244346 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" podStartSLOduration=2.2645201090000002 podStartE2EDuration="34.244331868s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.773984422 +0000 UTC m=+994.272566447" lastFinishedPulling="2026-02-25 11:10:00.753796181 +0000 UTC m=+1026.252378206" observedRunningTime="2026-02-25 11:10:01.242799967 +0000 UTC m=+1026.741382002" watchObservedRunningTime="2026-02-25 11:10:01.244331868 +0000 UTC m=+1026.742913903" Feb 25 11:10:01 crc kubenswrapper[4725]: I0225 11:10:01.258617 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" podStartSLOduration=2.486693011 podStartE2EDuration="34.258600248s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.895195173 +0000 UTC m=+994.393777198" lastFinishedPulling="2026-02-25 11:10:00.6671024 +0000 UTC m=+1026.165684435" observedRunningTime="2026-02-25 11:10:01.252953257 +0000 UTC m=+1026.751535292" watchObservedRunningTime="2026-02-25 11:10:01.258600248 +0000 UTC m=+1026.757182283" Feb 25 11:10:03 crc kubenswrapper[4725]: I0225 11:10:03.249444 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" 
event={"ID":"9a7b2bf7-fab5-4634-9dfa-147dc2de21bc","Type":"ContainerStarted","Data":"d0167add1ac32ac96cd37c2ddd6a290b450e532c8d9b9fc69c0b4bd17a1b1045"} Feb 25 11:10:03 crc kubenswrapper[4725]: I0225 11:10:03.250026 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" Feb 25 11:10:03 crc kubenswrapper[4725]: I0225 11:10:03.416748 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6872z" Feb 25 11:10:03 crc kubenswrapper[4725]: I0225 11:10:03.434080 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" podStartSLOduration=2.072313355 podStartE2EDuration="36.434064749s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.492891919 +0000 UTC m=+993.991473934" lastFinishedPulling="2026-02-25 11:10:02.854643293 +0000 UTC m=+1028.353225328" observedRunningTime="2026-02-25 11:10:03.270624432 +0000 UTC m=+1028.769206557" watchObservedRunningTime="2026-02-25 11:10:03.434064749 +0000 UTC m=+1028.932646774" Feb 25 11:10:03 crc kubenswrapper[4725]: I0225 11:10:03.814800 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd" Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.108682 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7489bcf59c-kb5pq" Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.256395 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" 
event={"ID":"0279e1a1-c275-48e8-815c-0afae718b93a","Type":"ContainerStarted","Data":"1014a561d44d809660ea0fa9853d408f8913a8e20383d95b1da5d0a78d6dce0d"} Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.256672 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.258382 4725 generic.go:334] "Generic (PLEG): container finished" podID="22e6596a-9d15-422f-8436-5c3ea71de9a6" containerID="1d69f749f1434c1c7237a4c7672735e636b9586bade94b08610ec1bbebc6cc47" exitCode=0 Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.258459 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533630-v7bl7" event={"ID":"22e6596a-9d15-422f-8436-5c3ea71de9a6","Type":"ContainerDied","Data":"1d69f749f1434c1c7237a4c7672735e636b9586bade94b08610ec1bbebc6cc47"} Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.260569 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" event={"ID":"c07a7a9d-d976-4d10-af1d-b92b5da76d71","Type":"ContainerStarted","Data":"83f7c4ece7835bb65332b7fefcb8c80124974ffb80b2b503bf54234c2f712f8d"} Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.260741 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.262126 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" event={"ID":"6cf86133-a9ef-4a8b-a957-ef8e588b200e","Type":"ContainerStarted","Data":"1c55d32e1e25c6c079fe974f9d1566c7ba6f91e041dc8c701be67b39e11c0a0d"} Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.262456 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.274287 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" podStartSLOduration=2.807138683 podStartE2EDuration="37.274268196s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.507873408 +0000 UTC m=+994.006455423" lastFinishedPulling="2026-02-25 11:10:02.975002911 +0000 UTC m=+1028.473584936" observedRunningTime="2026-02-25 11:10:04.269250902 +0000 UTC m=+1029.767832917" watchObservedRunningTime="2026-02-25 11:10:04.274268196 +0000 UTC m=+1029.772850221" Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.301970 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" podStartSLOduration=2.8973169260000002 podStartE2EDuration="37.301954784s" podCreationTimestamp="2026-02-25 11:09:27 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.623061008 +0000 UTC m=+994.121643033" lastFinishedPulling="2026-02-25 11:10:03.027698866 +0000 UTC m=+1028.526280891" observedRunningTime="2026-02-25 11:10:04.298694717 +0000 UTC m=+1029.797276752" watchObservedRunningTime="2026-02-25 11:10:04.301954784 +0000 UTC m=+1029.800536819" Feb 25 11:10:04 crc kubenswrapper[4725]: I0225 11:10:04.319674 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" podStartSLOduration=4.077166381 podStartE2EDuration="38.319655556s" podCreationTimestamp="2026-02-25 11:09:26 +0000 UTC" firstStartedPulling="2026-02-25 11:09:28.751211515 +0000 UTC m=+994.249793540" lastFinishedPulling="2026-02-25 11:10:02.99370069 +0000 UTC m=+1028.492282715" observedRunningTime="2026-02-25 11:10:04.314044486 +0000 UTC m=+1029.812626521" 
watchObservedRunningTime="2026-02-25 11:10:04.319655556 +0000 UTC m=+1029.818237581" Feb 25 11:10:05 crc kubenswrapper[4725]: I0225 11:10:05.557788 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533630-v7bl7" Feb 25 11:10:05 crc kubenswrapper[4725]: I0225 11:10:05.639396 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28krf\" (UniqueName: \"kubernetes.io/projected/22e6596a-9d15-422f-8436-5c3ea71de9a6-kube-api-access-28krf\") pod \"22e6596a-9d15-422f-8436-5c3ea71de9a6\" (UID: \"22e6596a-9d15-422f-8436-5c3ea71de9a6\") " Feb 25 11:10:05 crc kubenswrapper[4725]: I0225 11:10:05.649371 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e6596a-9d15-422f-8436-5c3ea71de9a6-kube-api-access-28krf" (OuterVolumeSpecName: "kube-api-access-28krf") pod "22e6596a-9d15-422f-8436-5c3ea71de9a6" (UID: "22e6596a-9d15-422f-8436-5c3ea71de9a6"). InnerVolumeSpecName "kube-api-access-28krf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:10:05 crc kubenswrapper[4725]: I0225 11:10:05.740957 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28krf\" (UniqueName: \"kubernetes.io/projected/22e6596a-9d15-422f-8436-5c3ea71de9a6-kube-api-access-28krf\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:06 crc kubenswrapper[4725]: I0225 11:10:06.284754 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533630-v7bl7" event={"ID":"22e6596a-9d15-422f-8436-5c3ea71de9a6","Type":"ContainerDied","Data":"beb68d5ed0aecd5d0a85cb32f2ff5ebf441a58e11b70399987ec92de6881c51c"} Feb 25 11:10:06 crc kubenswrapper[4725]: I0225 11:10:06.284821 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beb68d5ed0aecd5d0a85cb32f2ff5ebf441a58e11b70399987ec92de6881c51c" Feb 25 11:10:06 crc kubenswrapper[4725]: I0225 11:10:06.284913 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533630-v7bl7" Feb 25 11:10:06 crc kubenswrapper[4725]: I0225 11:10:06.640207 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533624-r2kgf"] Feb 25 11:10:06 crc kubenswrapper[4725]: I0225 11:10:06.647694 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533624-r2kgf"] Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.235922 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3f5247-3533-4360-a134-2c2d24332e5f" path="/var/lib/kubelet/pods/ac3f5247-3533-4360-a134-2c2d24332e5f/volumes" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.248685 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-65rfv" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.269068 4725 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-gm94c" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.349523 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-97g26" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.481378 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-wj5dw" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.523488 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-l278b" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.682560 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-h2tmg" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.729268 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6s7s5" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.828662 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-kn6fp" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.868857 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8fthg" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.894099 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-lgqlc" Feb 25 11:10:07 crc kubenswrapper[4725]: I0225 11:10:07.969677 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-v8c26" Feb 25 11:10:08 crc kubenswrapper[4725]: I0225 11:10:08.035369 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mvqqg" Feb 25 11:10:08 crc kubenswrapper[4725]: I0225 11:10:08.140644 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-6lfbp" Feb 25 11:10:09 crc kubenswrapper[4725]: I0225 11:10:09.695486 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66gx8" Feb 25 11:10:09 crc kubenswrapper[4725]: I0225 11:10:09.759000 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66gx8"] Feb 25 11:10:10 crc kubenswrapper[4725]: I0225 11:10:10.314973 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-66gx8" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="registry-server" containerID="cri-o://be1dcca38271cd0d8f93de0cc86396eaae0d22fd411d4ea0f3cff72857247ebf" gracePeriod=2 Feb 25 11:10:11 crc kubenswrapper[4725]: I0225 11:10:11.326492 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerID="be1dcca38271cd0d8f93de0cc86396eaae0d22fd411d4ea0f3cff72857247ebf" exitCode=0 Feb 25 11:10:11 crc kubenswrapper[4725]: I0225 11:10:11.326645 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66gx8" event={"ID":"5c09b8a8-b815-45fb-9ef1-8e78844135cc","Type":"ContainerDied","Data":"be1dcca38271cd0d8f93de0cc86396eaae0d22fd411d4ea0f3cff72857247ebf"} Feb 25 11:10:11 crc kubenswrapper[4725]: I0225 11:10:11.555939 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:10:11 crc kubenswrapper[4725]: I0225 11:10:11.556020 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:10:11 crc kubenswrapper[4725]: I0225 11:10:11.556081 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 11:10:11 crc kubenswrapper[4725]: I0225 11:10:11.556738 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7caa77cf5b27b9b598253176495f0fa2415fb90743494a0dd02b8750c84c33d8"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 11:10:11 crc kubenswrapper[4725]: I0225 11:10:11.556814 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://7caa77cf5b27b9b598253176495f0fa2415fb90743494a0dd02b8750c84c33d8" gracePeriod=600 Feb 25 11:10:12 crc kubenswrapper[4725]: I0225 11:10:12.339622 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="7caa77cf5b27b9b598253176495f0fa2415fb90743494a0dd02b8750c84c33d8" exitCode=0 Feb 25 11:10:12 crc kubenswrapper[4725]: I0225 11:10:12.339657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"7caa77cf5b27b9b598253176495f0fa2415fb90743494a0dd02b8750c84c33d8"} Feb 25 11:10:12 crc kubenswrapper[4725]: I0225 11:10:12.339720 4725 scope.go:117] "RemoveContainer" containerID="976e63b74d2c07989af044494938e1fa71027bc94145eac91a1d7ca390924f15" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.193642 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66gx8" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.278327 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-utilities\") pod \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.278361 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-catalog-content\") pod \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.278434 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2nzw\" (UniqueName: \"kubernetes.io/projected/5c09b8a8-b815-45fb-9ef1-8e78844135cc-kube-api-access-j2nzw\") pod \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\" (UID: \"5c09b8a8-b815-45fb-9ef1-8e78844135cc\") " Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.285129 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c09b8a8-b815-45fb-9ef1-8e78844135cc-kube-api-access-j2nzw" (OuterVolumeSpecName: "kube-api-access-j2nzw") pod "5c09b8a8-b815-45fb-9ef1-8e78844135cc" (UID: 
"5c09b8a8-b815-45fb-9ef1-8e78844135cc"). InnerVolumeSpecName "kube-api-access-j2nzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.286969 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-utilities" (OuterVolumeSpecName: "utilities") pod "5c09b8a8-b815-45fb-9ef1-8e78844135cc" (UID: "5c09b8a8-b815-45fb-9ef1-8e78844135cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.309738 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c09b8a8-b815-45fb-9ef1-8e78844135cc" (UID: "5c09b8a8-b815-45fb-9ef1-8e78844135cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.379158 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66gx8" event={"ID":"5c09b8a8-b815-45fb-9ef1-8e78844135cc","Type":"ContainerDied","Data":"ea70d16f23c6a3b7b9bd50ba8ad8398013bf970c14eed60c04c7793efb94c443"} Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.379239 4725 scope.go:117] "RemoveContainer" containerID="be1dcca38271cd0d8f93de0cc86396eaae0d22fd411d4ea0f3cff72857247ebf" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.379280 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66gx8" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.380004 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.380588 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c09b8a8-b815-45fb-9ef1-8e78844135cc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.380640 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2nzw\" (UniqueName: \"kubernetes.io/projected/5c09b8a8-b815-45fb-9ef1-8e78844135cc-kube-api-access-j2nzw\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.406730 4725 scope.go:117] "RemoveContainer" containerID="4ea1742664c14f3e7bbccd863ab8b2a2e15504d64fa7773effef7753b4fdefb5" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.428109 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66gx8"] Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.435605 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66gx8"] Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.447765 4725 scope.go:117] "RemoveContainer" containerID="3bb3088341be3202cf583e56552012834cb365f99b53c50f59d702883180baed" Feb 25 11:10:15 crc kubenswrapper[4725]: I0225 11:10:15.664779 4725 scope.go:117] "RemoveContainer" containerID="eb296869dd57a44ec8e543377f62d34b7303f1b82deb6c47732be71305eb5f20" Feb 25 11:10:16 crc kubenswrapper[4725]: I0225 11:10:16.391707 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"e9d1cf00d5958f238b464e2eb2f371e000d949ef3901a3f7ece30337723bea95"} Feb 25 11:10:17 crc kubenswrapper[4725]: I0225 11:10:17.231799 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" path="/var/lib/kubelet/pods/5c09b8a8-b815-45fb-9ef1-8e78844135cc/volumes" Feb 25 11:10:17 crc kubenswrapper[4725]: I0225 11:10:17.664263 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2vhq7" Feb 25 11:10:17 crc kubenswrapper[4725]: I0225 11:10:17.694239 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-25sql" Feb 25 11:10:17 crc kubenswrapper[4725]: I0225 11:10:17.742312 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-pxnr7" Feb 25 11:10:17 crc kubenswrapper[4725]: I0225 11:10:17.746773 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-j4hbq" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.749638 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jp65c"] Feb 25 11:10:34 crc kubenswrapper[4725]: E0225 11:10:34.750414 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="extract-content" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.750426 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="extract-content" Feb 25 11:10:34 crc kubenswrapper[4725]: E0225 11:10:34.750444 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e6596a-9d15-422f-8436-5c3ea71de9a6" 
containerName="oc" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.750451 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e6596a-9d15-422f-8436-5c3ea71de9a6" containerName="oc" Feb 25 11:10:34 crc kubenswrapper[4725]: E0225 11:10:34.750463 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="extract-utilities" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.750469 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="extract-utilities" Feb 25 11:10:34 crc kubenswrapper[4725]: E0225 11:10:34.750482 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="registry-server" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.750489 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="registry-server" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.750607 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c09b8a8-b815-45fb-9ef1-8e78844135cc" containerName="registry-server" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.750626 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="22e6596a-9d15-422f-8436-5c3ea71de9a6" containerName="oc" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.751364 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.752977 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.753251 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.753263 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.753566 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kp2c7" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.759712 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jp65c"] Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.819869 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pllrl"] Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.833366 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.837706 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.847506 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pllrl"] Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.864100 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swq6v\" (UniqueName: \"kubernetes.io/projected/4092032e-ef2c-430b-bded-a96402f6b6c8-kube-api-access-swq6v\") pod \"dnsmasq-dns-675f4bcbfc-jp65c\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.864164 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4092032e-ef2c-430b-bded-a96402f6b6c8-config\") pod \"dnsmasq-dns-675f4bcbfc-jp65c\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.864189 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tlxt\" (UniqueName: \"kubernetes.io/projected/24f2045c-8355-4a2c-9957-520347d789f3-kube-api-access-9tlxt\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.864233 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.864262 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-config\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.965177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4092032e-ef2c-430b-bded-a96402f6b6c8-config\") pod \"dnsmasq-dns-675f4bcbfc-jp65c\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.965261 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tlxt\" (UniqueName: \"kubernetes.io/projected/24f2045c-8355-4a2c-9957-520347d789f3-kube-api-access-9tlxt\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.965342 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.965408 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-config\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 
11:10:34.965498 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swq6v\" (UniqueName: \"kubernetes.io/projected/4092032e-ef2c-430b-bded-a96402f6b6c8-kube-api-access-swq6v\") pod \"dnsmasq-dns-675f4bcbfc-jp65c\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.966479 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.966875 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4092032e-ef2c-430b-bded-a96402f6b6c8-config\") pod \"dnsmasq-dns-675f4bcbfc-jp65c\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.967219 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-config\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.987135 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swq6v\" (UniqueName: \"kubernetes.io/projected/4092032e-ef2c-430b-bded-a96402f6b6c8-kube-api-access-swq6v\") pod \"dnsmasq-dns-675f4bcbfc-jp65c\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:34 crc kubenswrapper[4725]: I0225 11:10:34.998429 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9tlxt\" (UniqueName: \"kubernetes.io/projected/24f2045c-8355-4a2c-9957-520347d789f3-kube-api-access-9tlxt\") pod \"dnsmasq-dns-78dd6ddcc-pllrl\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:35 crc kubenswrapper[4725]: I0225 11:10:35.112498 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:35 crc kubenswrapper[4725]: I0225 11:10:35.153640 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:35 crc kubenswrapper[4725]: W0225 11:10:35.606007 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4092032e_ef2c_430b_bded_a96402f6b6c8.slice/crio-9e672a0c0166e61ff33af68398e3beb208e9363ebf4d198b0f424ecc69f2e685 WatchSource:0}: Error finding container 9e672a0c0166e61ff33af68398e3beb208e9363ebf4d198b0f424ecc69f2e685: Status 404 returned error can't find the container with id 9e672a0c0166e61ff33af68398e3beb208e9363ebf4d198b0f424ecc69f2e685 Feb 25 11:10:35 crc kubenswrapper[4725]: I0225 11:10:35.607609 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jp65c"] Feb 25 11:10:35 crc kubenswrapper[4725]: I0225 11:10:35.651959 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pllrl"] Feb 25 11:10:35 crc kubenswrapper[4725]: W0225 11:10:35.666978 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24f2045c_8355_4a2c_9957_520347d789f3.slice/crio-f5833b85c783d4a1136116059dccc886bce76bfab20c696386eb0416bd8c7db7 WatchSource:0}: Error finding container f5833b85c783d4a1136116059dccc886bce76bfab20c696386eb0416bd8c7db7: Status 404 returned error can't find the container with id 
f5833b85c783d4a1136116059dccc886bce76bfab20c696386eb0416bd8c7db7 Feb 25 11:10:36 crc kubenswrapper[4725]: I0225 11:10:36.562274 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" event={"ID":"4092032e-ef2c-430b-bded-a96402f6b6c8","Type":"ContainerStarted","Data":"9e672a0c0166e61ff33af68398e3beb208e9363ebf4d198b0f424ecc69f2e685"} Feb 25 11:10:36 crc kubenswrapper[4725]: I0225 11:10:36.563636 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" event={"ID":"24f2045c-8355-4a2c-9957-520347d789f3","Type":"ContainerStarted","Data":"f5833b85c783d4a1136116059dccc886bce76bfab20c696386eb0416bd8c7db7"} Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.414524 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jp65c"] Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.446369 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4kxtq"] Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.447408 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.455372 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4kxtq"] Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.505693 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.505750 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4h7\" (UniqueName: \"kubernetes.io/projected/15d73a7a-5441-4c0b-812e-f57ed4bdd594-kube-api-access-kv4h7\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.505774 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-config\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.606762 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.606811 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4h7\" (UniqueName: 
\"kubernetes.io/projected/15d73a7a-5441-4c0b-812e-f57ed4bdd594-kube-api-access-kv4h7\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.606848 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-config\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.608036 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-config\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.608089 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.629410 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4h7\" (UniqueName: \"kubernetes.io/projected/15d73a7a-5441-4c0b-812e-f57ed4bdd594-kube-api-access-kv4h7\") pod \"dnsmasq-dns-666b6646f7-4kxtq\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.742306 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pllrl"] Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.759737 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-xrrkb"] Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.764813 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.776907 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.778026 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xrrkb"] Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.812494 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.812541 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxq6\" (UniqueName: \"kubernetes.io/projected/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-kube-api-access-mtxq6\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.812560 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-config\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.915536 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.915600 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxq6\" (UniqueName: \"kubernetes.io/projected/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-kube-api-access-mtxq6\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.915619 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-config\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.916510 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-config\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.917052 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:37 crc kubenswrapper[4725]: I0225 11:10:37.943212 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxq6\" (UniqueName: \"kubernetes.io/projected/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-kube-api-access-mtxq6\") pod 
\"dnsmasq-dns-57d769cc4f-xrrkb\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.092397 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.318796 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4kxtq"] Feb 25 11:10:38 crc kubenswrapper[4725]: W0225 11:10:38.339038 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d73a7a_5441_4c0b_812e_f57ed4bdd594.slice/crio-16de003be0a226da33b6691f831130ec508b1822789aeada71dd042e63bf8116 WatchSource:0}: Error finding container 16de003be0a226da33b6691f831130ec508b1822789aeada71dd042e63bf8116: Status 404 returned error can't find the container with id 16de003be0a226da33b6691f831130ec508b1822789aeada71dd042e63bf8116 Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.557092 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xrrkb"] Feb 25 11:10:38 crc kubenswrapper[4725]: W0225 11:10:38.562342 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4efcc1fc_3f0d_42c6_81bc_b9b5797279a3.slice/crio-58621bd4fd3545459bb904f533102681fae8519b1d5f06f44114632c0f627f26 WatchSource:0}: Error finding container 58621bd4fd3545459bb904f533102681fae8519b1d5f06f44114632c0f627f26: Status 404 returned error can't find the container with id 58621bd4fd3545459bb904f533102681fae8519b1d5f06f44114632c0f627f26 Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.579864 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" 
event={"ID":"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3","Type":"ContainerStarted","Data":"58621bd4fd3545459bb904f533102681fae8519b1d5f06f44114632c0f627f26"} Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.582271 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" event={"ID":"15d73a7a-5441-4c0b-812e-f57ed4bdd594","Type":"ContainerStarted","Data":"16de003be0a226da33b6691f831130ec508b1822789aeada71dd042e63bf8116"} Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.609267 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.610847 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.614893 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mmfh7" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.615109 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.615286 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.615483 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.615901 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.615954 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.616066 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 25 11:10:38 crc 
kubenswrapper[4725]: I0225 11:10:38.622076 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728439 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728488 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728524 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728549 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728585 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57e7a103-f119-4d8e-bb7f-96f36b66994e-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728609 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57e7a103-f119-4d8e-bb7f-96f36b66994e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728641 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728668 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728691 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6dw\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-kube-api-access-tv6dw\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728820 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " 
pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.728971 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-config-data\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831293 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-config-data\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831370 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831395 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831418 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831435 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831464 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57e7a103-f119-4d8e-bb7f-96f36b66994e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831483 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57e7a103-f119-4d8e-bb7f-96f36b66994e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831506 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831528 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831579 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6dw\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-kube-api-access-tv6dw\") pod 
\"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.831603 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.832012 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.833635 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.833881 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.833923 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 
11:10:38.834239 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.834604 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-config-data\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.837449 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57e7a103-f119-4d8e-bb7f-96f36b66994e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.838005 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.850865 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57e7a103-f119-4d8e-bb7f-96f36b66994e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.854687 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.854929 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6dw\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-kube-api-access-tv6dw\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.857221 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " pod="openstack/rabbitmq-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.897895 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.899120 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.911025 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.911305 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.911362 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gw6sm" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.911468 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.911613 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.911752 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.911945 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.918715 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933115 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933167 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933189 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933230 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d1a511fd-4696-456a-8263-da4cd2f5eff1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933249 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq254\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-kube-api-access-jq254\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933299 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933331 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933351 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d1a511fd-4696-456a-8263-da4cd2f5eff1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933400 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.933426 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:38 crc kubenswrapper[4725]: I0225 11:10:38.973863 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034599 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034648 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034669 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034706 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d1a511fd-4696-456a-8263-da4cd2f5eff1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034743 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 
11:10:39.034758 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq254\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-kube-api-access-jq254\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034777 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034895 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034913 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d1a511fd-4696-456a-8263-da4cd2f5eff1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034945 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.034964 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.035759 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.035885 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.036358 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.036812 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.040188 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.042981 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.043376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d1a511fd-4696-456a-8263-da4cd2f5eff1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.043986 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.056470 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.085083 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d1a511fd-4696-456a-8263-da4cd2f5eff1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc 
kubenswrapper[4725]: I0225 11:10:39.104004 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.111730 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq254\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-kube-api-access-jq254\") pod \"rabbitmq-cell1-server-0\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:39 crc kubenswrapper[4725]: I0225 11:10:39.248729 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.313633 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.316105 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.319563 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-frrnn" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.319784 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.319966 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.320865 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.328986 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.343696 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475268 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-kolla-config\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475324 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6c23a18-36cf-4d71-885d-f2b93ba16375-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475352 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475468 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92qx9\" (UniqueName: \"kubernetes.io/projected/a6c23a18-36cf-4d71-885d-f2b93ba16375-kube-api-access-92qx9\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475500 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475527 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c23a18-36cf-4d71-885d-f2b93ba16375-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475559 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c23a18-36cf-4d71-885d-f2b93ba16375-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.475590 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-config-data-default\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577294 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-kolla-config\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577359 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6c23a18-36cf-4d71-885d-f2b93ba16375-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577377 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577437 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92qx9\" (UniqueName: \"kubernetes.io/projected/a6c23a18-36cf-4d71-885d-f2b93ba16375-kube-api-access-92qx9\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577462 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: 
\"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577480 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c23a18-36cf-4d71-885d-f2b93ba16375-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577507 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c23a18-36cf-4d71-885d-f2b93ba16375-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.577529 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-config-data-default\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.578136 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-kolla-config\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.578475 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-config-data-default\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.578516 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6c23a18-36cf-4d71-885d-f2b93ba16375-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.578710 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.583416 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c23a18-36cf-4d71-885d-f2b93ba16375-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.585785 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c23a18-36cf-4d71-885d-f2b93ba16375-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.588422 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c23a18-36cf-4d71-885d-f2b93ba16375-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.593436 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92qx9\" (UniqueName: 
\"kubernetes.io/projected/a6c23a18-36cf-4d71-885d-f2b93ba16375-kube-api-access-92qx9\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.603956 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"a6c23a18-36cf-4d71-885d-f2b93ba16375\") " pod="openstack/openstack-galera-0" Feb 25 11:10:40 crc kubenswrapper[4725]: I0225 11:10:40.643192 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.532739 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.533931 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.535635 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.536290 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.536511 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.541445 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2q968" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.552501 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.696509 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.696553 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.696755 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/99ef16ee-b18a-4374-9b14-0d6e08df5558-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.697017 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ef16ee-b18a-4374-9b14-0d6e08df5558-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.697049 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/99ef16ee-b18a-4374-9b14-0d6e08df5558-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.697175 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgksj\" (UniqueName: \"kubernetes.io/projected/99ef16ee-b18a-4374-9b14-0d6e08df5558-kube-api-access-sgksj\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.697242 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.697269 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798479 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798622 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798654 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798687 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/99ef16ee-b18a-4374-9b14-0d6e08df5558-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798767 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/99ef16ee-b18a-4374-9b14-0d6e08df5558-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798810 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ef16ee-b18a-4374-9b14-0d6e08df5558-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798892 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgksj\" (UniqueName: \"kubernetes.io/projected/99ef16ee-b18a-4374-9b14-0d6e08df5558-kube-api-access-sgksj\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.798945 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.800339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.800790 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.800811 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/99ef16ee-b18a-4374-9b14-0d6e08df5558-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.801909 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.801989 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/99ef16ee-b18a-4374-9b14-0d6e08df5558-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.806738 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/99ef16ee-b18a-4374-9b14-0d6e08df5558-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.810552 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ef16ee-b18a-4374-9b14-0d6e08df5558-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.817572 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgksj\" (UniqueName: \"kubernetes.io/projected/99ef16ee-b18a-4374-9b14-0d6e08df5558-kube-api-access-sgksj\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.838002 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"99ef16ee-b18a-4374-9b14-0d6e08df5558\") " pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.841457 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.844366 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.846982 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.847339 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.847340 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-c6zhv" Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.864239 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 25 11:10:41 crc kubenswrapper[4725]: I0225 11:10:41.896364 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.002000 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w67bk\" (UniqueName: \"kubernetes.io/projected/a30e3088-499a-491e-a9b0-65e54ac709c9-kube-api-access-w67bk\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.002057 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a30e3088-499a-491e-a9b0-65e54ac709c9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.002076 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a30e3088-499a-491e-a9b0-65e54ac709c9-config-data\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") 
" pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.002190 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a30e3088-499a-491e-a9b0-65e54ac709c9-kolla-config\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.002215 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30e3088-499a-491e-a9b0-65e54ac709c9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.103779 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a30e3088-499a-491e-a9b0-65e54ac709c9-kolla-config\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.103846 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30e3088-499a-491e-a9b0-65e54ac709c9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.103884 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w67bk\" (UniqueName: \"kubernetes.io/projected/a30e3088-499a-491e-a9b0-65e54ac709c9-kube-api-access-w67bk\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.103923 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a30e3088-499a-491e-a9b0-65e54ac709c9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.103942 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a30e3088-499a-491e-a9b0-65e54ac709c9-config-data\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.105100 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a30e3088-499a-491e-a9b0-65e54ac709c9-config-data\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.105118 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a30e3088-499a-491e-a9b0-65e54ac709c9-kolla-config\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.122682 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30e3088-499a-491e-a9b0-65e54ac709c9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.123122 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a30e3088-499a-491e-a9b0-65e54ac709c9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc 
kubenswrapper[4725]: I0225 11:10:42.138118 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w67bk\" (UniqueName: \"kubernetes.io/projected/a30e3088-499a-491e-a9b0-65e54ac709c9-kube-api-access-w67bk\") pod \"memcached-0\" (UID: \"a30e3088-499a-491e-a9b0-65e54ac709c9\") " pod="openstack/memcached-0" Feb 25 11:10:42 crc kubenswrapper[4725]: I0225 11:10:42.196079 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 25 11:10:44 crc kubenswrapper[4725]: I0225 11:10:44.173092 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:10:44 crc kubenswrapper[4725]: I0225 11:10:44.174980 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 11:10:44 crc kubenswrapper[4725]: I0225 11:10:44.179110 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:10:44 crc kubenswrapper[4725]: I0225 11:10:44.190241 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ckl6t" Feb 25 11:10:44 crc kubenswrapper[4725]: I0225 11:10:44.338912 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f98hm\" (UniqueName: \"kubernetes.io/projected/b330a7b3-8fd7-4db6-8d82-257570b2bd58-kube-api-access-f98hm\") pod \"kube-state-metrics-0\" (UID: \"b330a7b3-8fd7-4db6-8d82-257570b2bd58\") " pod="openstack/kube-state-metrics-0" Feb 25 11:10:44 crc kubenswrapper[4725]: I0225 11:10:44.441764 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f98hm\" (UniqueName: \"kubernetes.io/projected/b330a7b3-8fd7-4db6-8d82-257570b2bd58-kube-api-access-f98hm\") pod \"kube-state-metrics-0\" (UID: \"b330a7b3-8fd7-4db6-8d82-257570b2bd58\") " pod="openstack/kube-state-metrics-0" Feb 25 11:10:44 crc 
kubenswrapper[4725]: I0225 11:10:44.469421 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f98hm\" (UniqueName: \"kubernetes.io/projected/b330a7b3-8fd7-4db6-8d82-257570b2bd58-kube-api-access-f98hm\") pod \"kube-state-metrics-0\" (UID: \"b330a7b3-8fd7-4db6-8d82-257570b2bd58\") " pod="openstack/kube-state-metrics-0" Feb 25 11:10:44 crc kubenswrapper[4725]: I0225 11:10:44.521773 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.847228 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xpvnr"] Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.848949 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.854693 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.854995 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2sh5x" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.855241 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.859954 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xpvnr"] Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.877546 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-drphb"] Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.885746 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.914314 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-drphb"] Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.998482 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2445fb4-75ca-4ea2-b979-5757105279ab-combined-ca-bundle\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.998577 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493d04a9-b969-4c11-bd84-a1e9d57b7772-scripts\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.998642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-lib\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.998665 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-log\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.998707 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-run\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.999103 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-etc-ovs\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:47 crc kubenswrapper[4725]: I0225 11:10:47.999907 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2445fb4-75ca-4ea2-b979-5757105279ab-ovn-controller-tls-certs\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.000020 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2445fb4-75ca-4ea2-b979-5757105279ab-scripts\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.000048 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g7hm\" (UniqueName: \"kubernetes.io/projected/d2445fb4-75ca-4ea2-b979-5757105279ab-kube-api-access-6g7hm\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.000101 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f999\" (UniqueName: 
\"kubernetes.io/projected/493d04a9-b969-4c11-bd84-a1e9d57b7772-kube-api-access-7f999\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.000129 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-run\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.000185 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-run-ovn\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.000281 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-log-ovn\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102300 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f999\" (UniqueName: \"kubernetes.io/projected/493d04a9-b969-4c11-bd84-a1e9d57b7772-kube-api-access-7f999\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102351 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-run\") 
pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102375 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-run-ovn\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102398 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-log-ovn\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102454 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2445fb4-75ca-4ea2-b979-5757105279ab-combined-ca-bundle\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102491 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493d04a9-b969-4c11-bd84-a1e9d57b7772-scripts\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102545 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-lib\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc 
kubenswrapper[4725]: I0225 11:10:48.102568 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-log\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102597 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-run\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102625 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-etc-ovs\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102642 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2445fb4-75ca-4ea2-b979-5757105279ab-ovn-controller-tls-certs\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102666 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2445fb4-75ca-4ea2-b979-5757105279ab-scripts\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.102683 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g7hm\" 
(UniqueName: \"kubernetes.io/projected/d2445fb4-75ca-4ea2-b979-5757105279ab-kube-api-access-6g7hm\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.103041 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-run\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.103065 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-lib\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.103174 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-etc-ovs\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.103225 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-run\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.103248 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/493d04a9-b969-4c11-bd84-a1e9d57b7772-var-log\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " 
pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.103308 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-run-ovn\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.103579 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d2445fb4-75ca-4ea2-b979-5757105279ab-var-log-ovn\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.107672 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493d04a9-b969-4c11-bd84-a1e9d57b7772-scripts\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.113004 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2445fb4-75ca-4ea2-b979-5757105279ab-combined-ca-bundle\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.115857 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2445fb4-75ca-4ea2-b979-5757105279ab-scripts\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.116332 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2445fb4-75ca-4ea2-b979-5757105279ab-ovn-controller-tls-certs\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.117315 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f999\" (UniqueName: \"kubernetes.io/projected/493d04a9-b969-4c11-bd84-a1e9d57b7772-kube-api-access-7f999\") pod \"ovn-controller-ovs-drphb\" (UID: \"493d04a9-b969-4c11-bd84-a1e9d57b7772\") " pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.133155 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g7hm\" (UniqueName: \"kubernetes.io/projected/d2445fb4-75ca-4ea2-b979-5757105279ab-kube-api-access-6g7hm\") pod \"ovn-controller-xpvnr\" (UID: \"d2445fb4-75ca-4ea2-b979-5757105279ab\") " pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.167347 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xpvnr" Feb 25 11:10:48 crc kubenswrapper[4725]: I0225 11:10:48.205724 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.570440 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.571707 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.574131 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.574414 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.574581 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.574915 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-d7h6g" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.575085 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.585267 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726255 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw7tt\" (UniqueName: \"kubernetes.io/projected/24e787b7-ef1d-4c61-b01a-f8119d7911c0-kube-api-access-cw7tt\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726332 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e787b7-ef1d-4c61-b01a-f8119d7911c0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726387 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726423 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726461 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24e787b7-ef1d-4c61-b01a-f8119d7911c0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726489 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e787b7-ef1d-4c61-b01a-f8119d7911c0-config\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726517 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.726540 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.827839 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24e787b7-ef1d-4c61-b01a-f8119d7911c0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.827882 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e787b7-ef1d-4c61-b01a-f8119d7911c0-config\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.827908 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.827929 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.827952 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw7tt\" (UniqueName: \"kubernetes.io/projected/24e787b7-ef1d-4c61-b01a-f8119d7911c0-kube-api-access-cw7tt\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " 
pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.827981 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e787b7-ef1d-4c61-b01a-f8119d7911c0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.828021 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.828049 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.828443 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24e787b7-ef1d-4c61-b01a-f8119d7911c0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.829894 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.829995 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e787b7-ef1d-4c61-b01a-f8119d7911c0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.830124 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e787b7-ef1d-4c61-b01a-f8119d7911c0-config\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.835170 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.835891 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.850853 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e787b7-ef1d-4c61-b01a-f8119d7911c0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.852622 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw7tt\" (UniqueName: \"kubernetes.io/projected/24e787b7-ef1d-4c61-b01a-f8119d7911c0-kube-api-access-cw7tt\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") 
" pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.862766 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24e787b7-ef1d-4c61-b01a-f8119d7911c0\") " pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:49 crc kubenswrapper[4725]: I0225 11:10:49.892273 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.047986 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.060115 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.069766 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dbg9b" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.070274 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.070576 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.070928 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.104082 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.148495 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-config\") pod \"ovsdbserver-sb-0\" 
(UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.148707 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.148787 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wffjq\" (UniqueName: \"kubernetes.io/projected/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-kube-api-access-wffjq\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.148964 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.149043 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.149292 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " 
pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.149338 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.149369 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251256 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251352 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wffjq\" (UniqueName: \"kubernetes.io/projected/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-kube-api-access-wffjq\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251447 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251500 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251595 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251630 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251668 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251691 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.251728 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.252615 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.254385 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-config\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.255101 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.259002 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.260676 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.266682 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.271341 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wffjq\" (UniqueName: \"kubernetes.io/projected/818d1929-2446-4ce6-80ec-6ed3fdec2b3d-kube-api-access-wffjq\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.285458 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"818d1929-2446-4ce6-80ec-6ed3fdec2b3d\") " pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:51 crc kubenswrapper[4725]: I0225 11:10:51.402735 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.148468 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.148623 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.149631 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tlxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-pllrl_openstack(24f2045c-8355-4a2c-9957-520347d789f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.149631 4725 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kv4h7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePol
icy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-4kxtq_openstack(15d73a7a-5441-4c0b-812e-f57ed4bdd594): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.150915 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" podUID="15d73a7a-5441-4c0b-812e-f57ed4bdd594" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.150969 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" podUID="24f2045c-8355-4a2c-9957-520347d789f3" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.192106 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.192277 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swq6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-jp65c_openstack(4092032e-ef2c-430b-bded-a96402f6b6c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.193528 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" podUID="4092032e-ef2c-430b-bded-a96402f6b6c8" Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.690541 4725 generic.go:334] "Generic (PLEG): container finished" podID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerID="116fbcc0df11fbf02a24e3e8fc933cd7c66f193e71c982b54d937e64f57d4849" exitCode=0 Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.690712 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" event={"ID":"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3","Type":"ContainerDied","Data":"116fbcc0df11fbf02a24e3e8fc933cd7c66f193e71c982b54d937e64f57d4849"} Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.744348 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.802108 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:10:52 crc kubenswrapper[4725]: W0225 11:10:52.891268 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2445fb4_75ca_4ea2_b979_5757105279ab.slice/crio-0dfc8dfb506b971b478c3881001068830bba1f5d9139918486ddc99f9f7d687b WatchSource:0}: Error finding container 0dfc8dfb506b971b478c3881001068830bba1f5d9139918486ddc99f9f7d687b: Status 404 returned error can't find the container with id 0dfc8dfb506b971b478c3881001068830bba1f5d9139918486ddc99f9f7d687b Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.891811 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.902129 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xpvnr"] Feb 25 11:10:52 crc kubenswrapper[4725]: W0225 11:10:52.903449 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6c23a18_36cf_4d71_885d_f2b93ba16375.slice/crio-8c66f10ebe71ba1b5829058d29080800942711b6f0839cbe1d8fd6282e104b7c WatchSource:0}: Error finding container 8c66f10ebe71ba1b5829058d29080800942711b6f0839cbe1d8fd6282e104b7c: Status 404 returned error can't find the container with id 8c66f10ebe71ba1b5829058d29080800942711b6f0839cbe1d8fd6282e104b7c Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.906917 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.925371 4725 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 25 11:10:52 crc kubenswrapper[4725]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/15d73a7a-5441-4c0b-812e-f57ed4bdd594/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 25 11:10:52 crc kubenswrapper[4725]: > podSandboxID="16de003be0a226da33b6691f831130ec508b1822789aeada71dd042e63bf8116" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.925561 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 11:10:52 crc kubenswrapper[4725]: init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kv4h7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-4kxtq_openstack(15d73a7a-5441-4c0b-812e-f57ed4bdd594): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/15d73a7a-5441-4c0b-812e-f57ed4bdd594/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 
25 11:10:52 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 25 11:10:52 crc kubenswrapper[4725]: E0225 11:10:52.926695 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/15d73a7a-5441-4c0b-812e-f57ed4bdd594/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" podUID="15d73a7a-5441-4c0b-812e-f57ed4bdd594" Feb 25 11:10:52 crc kubenswrapper[4725]: I0225 11:10:52.983597 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 11:10:52 crc kubenswrapper[4725]: W0225 11:10:52.986493 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24e787b7_ef1d_4c61_b01a_f8119d7911c0.slice/crio-01c73ac5f4562ec89ab9d4c6a563194cb62380c76a23fc8c8fad4d96a97928d2 WatchSource:0}: Error finding container 01c73ac5f4562ec89ab9d4c6a563194cb62380c76a23fc8c8fad4d96a97928d2: Status 404 returned error can't find the container with id 01c73ac5f4562ec89ab9d4c6a563194cb62380c76a23fc8c8fad4d96a97928d2 Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.093108 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.101169 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:10:53 crc kubenswrapper[4725]: W0225 11:10:53.123095 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a511fd_4696_456a_8263_da4cd2f5eff1.slice/crio-d4202ac9b5fd5d7a9e4d48adcaad80437647eb8924a79f2d618a7d282debb486 WatchSource:0}: Error finding container d4202ac9b5fd5d7a9e4d48adcaad80437647eb8924a79f2d618a7d282debb486: Status 404 returned error can't find 
the container with id d4202ac9b5fd5d7a9e4d48adcaad80437647eb8924a79f2d618a7d282debb486 Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.141218 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.147225 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 11:10:53 crc kubenswrapper[4725]: W0225 11:10:53.152422 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod818d1929_2446_4ce6_80ec_6ed3fdec2b3d.slice/crio-02ef3a951a9c055381dd8c0036f19a87b350e4fa2263880014b3c72a53f5284d WatchSource:0}: Error finding container 02ef3a951a9c055381dd8c0036f19a87b350e4fa2263880014b3c72a53f5284d: Status 404 returned error can't find the container with id 02ef3a951a9c055381dd8c0036f19a87b350e4fa2263880014b3c72a53f5284d Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.169615 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.213296 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4092032e-ef2c-430b-bded-a96402f6b6c8-config\") pod \"4092032e-ef2c-430b-bded-a96402f6b6c8\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.213433 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swq6v\" (UniqueName: \"kubernetes.io/projected/4092032e-ef2c-430b-bded-a96402f6b6c8-kube-api-access-swq6v\") pod \"4092032e-ef2c-430b-bded-a96402f6b6c8\" (UID: \"4092032e-ef2c-430b-bded-a96402f6b6c8\") " Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.213942 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4092032e-ef2c-430b-bded-a96402f6b6c8-config" (OuterVolumeSpecName: "config") pod "4092032e-ef2c-430b-bded-a96402f6b6c8" (UID: "4092032e-ef2c-430b-bded-a96402f6b6c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.221885 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4092032e-ef2c-430b-bded-a96402f6b6c8-kube-api-access-swq6v" (OuterVolumeSpecName: "kube-api-access-swq6v") pod "4092032e-ef2c-430b-bded-a96402f6b6c8" (UID: "4092032e-ef2c-430b-bded-a96402f6b6c8"). InnerVolumeSpecName "kube-api-access-swq6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.279606 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-drphb"] Feb 25 11:10:53 crc kubenswrapper[4725]: W0225 11:10:53.296679 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493d04a9_b969_4c11_bd84_a1e9d57b7772.slice/crio-2d565384ab95611c1b9eaac966bc0a49da62f5507720d6304ee10d8b241a121d WatchSource:0}: Error finding container 2d565384ab95611c1b9eaac966bc0a49da62f5507720d6304ee10d8b241a121d: Status 404 returned error can't find the container with id 2d565384ab95611c1b9eaac966bc0a49da62f5507720d6304ee10d8b241a121d Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.315020 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-config\") pod \"24f2045c-8355-4a2c-9957-520347d789f3\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.315075 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-dns-svc\") pod \"24f2045c-8355-4a2c-9957-520347d789f3\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.315136 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tlxt\" (UniqueName: \"kubernetes.io/projected/24f2045c-8355-4a2c-9957-520347d789f3-kube-api-access-9tlxt\") pod \"24f2045c-8355-4a2c-9957-520347d789f3\" (UID: \"24f2045c-8355-4a2c-9957-520347d789f3\") " Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.315573 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24f2045c-8355-4a2c-9957-520347d789f3" (UID: "24f2045c-8355-4a2c-9957-520347d789f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.315591 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-config" (OuterVolumeSpecName: "config") pod "24f2045c-8355-4a2c-9957-520347d789f3" (UID: "24f2045c-8355-4a2c-9957-520347d789f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.315930 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4092032e-ef2c-430b-bded-a96402f6b6c8-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.315980 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.316017 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24f2045c-8355-4a2c-9957-520347d789f3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.316033 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swq6v\" (UniqueName: \"kubernetes.io/projected/4092032e-ef2c-430b-bded-a96402f6b6c8-kube-api-access-swq6v\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.321709 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f2045c-8355-4a2c-9957-520347d789f3-kube-api-access-9tlxt" (OuterVolumeSpecName: 
"kube-api-access-9tlxt") pod "24f2045c-8355-4a2c-9957-520347d789f3" (UID: "24f2045c-8355-4a2c-9957-520347d789f3"). InnerVolumeSpecName "kube-api-access-9tlxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.422814 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tlxt\" (UniqueName: \"kubernetes.io/projected/24f2045c-8355-4a2c-9957-520347d789f3-kube-api-access-9tlxt\") on node \"crc\" DevicePath \"\"" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.701465 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57e7a103-f119-4d8e-bb7f-96f36b66994e","Type":"ContainerStarted","Data":"b3ce4e075f97844980fb715be3c14b371f81502dcee373e2ec83f9b4b7e04b07"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.703078 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a30e3088-499a-491e-a9b0-65e54ac709c9","Type":"ContainerStarted","Data":"501c4bc6d42431d679bd0bf83059073dbeff7a347b5f65fcc7fff7b1d841ff5d"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.708690 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a6c23a18-36cf-4d71-885d-f2b93ba16375","Type":"ContainerStarted","Data":"8c66f10ebe71ba1b5829058d29080800942711b6f0839cbe1d8fd6282e104b7c"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.711094 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24e787b7-ef1d-4c61-b01a-f8119d7911c0","Type":"ContainerStarted","Data":"01c73ac5f4562ec89ab9d4c6a563194cb62380c76a23fc8c8fad4d96a97928d2"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.712978 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"b330a7b3-8fd7-4db6-8d82-257570b2bd58","Type":"ContainerStarted","Data":"3847786803addb9937455ea335a5be3d8ba93568a72f7fb0c15d048abe56b0e5"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.714298 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xpvnr" event={"ID":"d2445fb4-75ca-4ea2-b979-5757105279ab","Type":"ContainerStarted","Data":"0dfc8dfb506b971b478c3881001068830bba1f5d9139918486ddc99f9f7d687b"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.719169 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" event={"ID":"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3","Type":"ContainerStarted","Data":"46dfedb0b8c045a8303726a41537738d3fa9e1d483dd0b7273834663cc71e3d5"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.719301 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.720750 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.720753 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pllrl" event={"ID":"24f2045c-8355-4a2c-9957-520347d789f3","Type":"ContainerDied","Data":"f5833b85c783d4a1136116059dccc886bce76bfab20c696386eb0416bd8c7db7"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.722455 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"818d1929-2446-4ce6-80ec-6ed3fdec2b3d","Type":"ContainerStarted","Data":"02ef3a951a9c055381dd8c0036f19a87b350e4fa2263880014b3c72a53f5284d"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.723687 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-drphb" event={"ID":"493d04a9-b969-4c11-bd84-a1e9d57b7772","Type":"ContainerStarted","Data":"2d565384ab95611c1b9eaac966bc0a49da62f5507720d6304ee10d8b241a121d"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.725071 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"99ef16ee-b18a-4374-9b14-0d6e08df5558","Type":"ContainerStarted","Data":"ea38be1c393501d813553db725d8fbf6b6c71ec1ec237758fe112999ff13200f"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.726317 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d1a511fd-4696-456a-8263-da4cd2f5eff1","Type":"ContainerStarted","Data":"d4202ac9b5fd5d7a9e4d48adcaad80437647eb8924a79f2d618a7d282debb486"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.727624 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" event={"ID":"4092032e-ef2c-430b-bded-a96402f6b6c8","Type":"ContainerDied","Data":"9e672a0c0166e61ff33af68398e3beb208e9363ebf4d198b0f424ecc69f2e685"} Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.727666 4725 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jp65c" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.739471 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" podStartSLOduration=3.013423843 podStartE2EDuration="16.739454104s" podCreationTimestamp="2026-02-25 11:10:37 +0000 UTC" firstStartedPulling="2026-02-25 11:10:38.566305908 +0000 UTC m=+1064.064887933" lastFinishedPulling="2026-02-25 11:10:52.292336159 +0000 UTC m=+1077.790918194" observedRunningTime="2026-02-25 11:10:53.732898632 +0000 UTC m=+1079.231480677" watchObservedRunningTime="2026-02-25 11:10:53.739454104 +0000 UTC m=+1079.238036129" Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.764294 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jp65c"] Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.769557 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jp65c"] Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.794577 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pllrl"] Feb 25 11:10:53 crc kubenswrapper[4725]: I0225 11:10:53.802162 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pllrl"] Feb 25 11:10:55 crc kubenswrapper[4725]: I0225 11:10:55.241316 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f2045c-8355-4a2c-9957-520347d789f3" path="/var/lib/kubelet/pods/24f2045c-8355-4a2c-9957-520347d789f3/volumes" Feb 25 11:10:55 crc kubenswrapper[4725]: I0225 11:10:55.241935 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4092032e-ef2c-430b-bded-a96402f6b6c8" path="/var/lib/kubelet/pods/4092032e-ef2c-430b-bded-a96402f6b6c8/volumes" Feb 25 11:10:58 crc kubenswrapper[4725]: I0225 11:10:58.094576 4725 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:10:58 crc kubenswrapper[4725]: I0225 11:10:58.178499 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4kxtq"] Feb 25 11:11:00 crc kubenswrapper[4725]: I0225 11:11:00.945083 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.059753 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv4h7\" (UniqueName: \"kubernetes.io/projected/15d73a7a-5441-4c0b-812e-f57ed4bdd594-kube-api-access-kv4h7\") pod \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.059856 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-dns-svc\") pod \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.059907 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-config\") pod \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\" (UID: \"15d73a7a-5441-4c0b-812e-f57ed4bdd594\") " Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.066789 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d73a7a-5441-4c0b-812e-f57ed4bdd594-kube-api-access-kv4h7" (OuterVolumeSpecName: "kube-api-access-kv4h7") pod "15d73a7a-5441-4c0b-812e-f57ed4bdd594" (UID: "15d73a7a-5441-4c0b-812e-f57ed4bdd594"). InnerVolumeSpecName "kube-api-access-kv4h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.080933 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15d73a7a-5441-4c0b-812e-f57ed4bdd594" (UID: "15d73a7a-5441-4c0b-812e-f57ed4bdd594"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.101092 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-config" (OuterVolumeSpecName: "config") pod "15d73a7a-5441-4c0b-812e-f57ed4bdd594" (UID: "15d73a7a-5441-4c0b-812e-f57ed4bdd594"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.161763 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv4h7\" (UniqueName: \"kubernetes.io/projected/15d73a7a-5441-4c0b-812e-f57ed4bdd594-kube-api-access-kv4h7\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.161802 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.161812 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d73a7a-5441-4c0b-812e-f57ed4bdd594-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.785476 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" event={"ID":"15d73a7a-5441-4c0b-812e-f57ed4bdd594","Type":"ContainerDied","Data":"16de003be0a226da33b6691f831130ec508b1822789aeada71dd042e63bf8116"} Feb 
25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.785512 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4kxtq" Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.830247 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4kxtq"] Feb 25 11:11:01 crc kubenswrapper[4725]: I0225 11:11:01.837046 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4kxtq"] Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.797086 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a6c23a18-36cf-4d71-885d-f2b93ba16375","Type":"ContainerStarted","Data":"9257bfa8d559072148c6b97c1ee185bedeb2e2c6ef6ab65b86ba8b8049154aca"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.799752 4725 generic.go:334] "Generic (PLEG): container finished" podID="493d04a9-b969-4c11-bd84-a1e9d57b7772" containerID="7bf6b5875c586bb199cc03952a790d9cbb94442d06ea177a380a89ee4c4e1908" exitCode=0 Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.799814 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-drphb" event={"ID":"493d04a9-b969-4c11-bd84-a1e9d57b7772","Type":"ContainerDied","Data":"7bf6b5875c586bb199cc03952a790d9cbb94442d06ea177a380a89ee4c4e1908"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.801530 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xpvnr" event={"ID":"d2445fb4-75ca-4ea2-b979-5757105279ab","Type":"ContainerStarted","Data":"9bdabd2c1c4f415bdce64be6e0855cc22ff21cfee425fda111ce99f22a0792d6"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.801752 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xpvnr" Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.807401 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"a30e3088-499a-491e-a9b0-65e54ac709c9","Type":"ContainerStarted","Data":"3a57b34ccc717e13153095cbebcb8c2d70986c0bc40d118be936ba2c47d7aaaf"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.807530 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.812751 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24e787b7-ef1d-4c61-b01a-f8119d7911c0","Type":"ContainerStarted","Data":"d104df32e9b0d3763a42c31dab0c598e07a5a19dd3b026eace05d93aa9ade43d"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.821725 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"818d1929-2446-4ce6-80ec-6ed3fdec2b3d","Type":"ContainerStarted","Data":"de83add029aa422ab3fa45ce1dc9cee6ae08689d45f41cf5bcaef1d1c531bf5f"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.827077 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"99ef16ee-b18a-4374-9b14-0d6e08df5558","Type":"ContainerStarted","Data":"da587342c08b81ddacff579f4531cd8b4babd91a5834f4c086f8e6a5c37df392"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.830063 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d1a511fd-4696-456a-8263-da4cd2f5eff1","Type":"ContainerStarted","Data":"6d69b6d7376a54b89e12188a0e9f6681be6c795c0ea23114c746f56b5175501a"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.832596 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b330a7b3-8fd7-4db6-8d82-257570b2bd58","Type":"ContainerStarted","Data":"62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de"} Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.832876 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/kube-state-metrics-0" Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.850186 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.918354781 podStartE2EDuration="21.850170869s" podCreationTimestamp="2026-02-25 11:10:41 +0000 UTC" firstStartedPulling="2026-02-25 11:10:52.929369123 +0000 UTC m=+1078.427951148" lastFinishedPulling="2026-02-25 11:11:00.861185211 +0000 UTC m=+1086.359767236" observedRunningTime="2026-02-25 11:11:02.847675353 +0000 UTC m=+1088.346257378" watchObservedRunningTime="2026-02-25 11:11:02.850170869 +0000 UTC m=+1088.348752894" Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.867231 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xpvnr" podStartSLOduration=7.160008179 podStartE2EDuration="15.867216746s" podCreationTimestamp="2026-02-25 11:10:47 +0000 UTC" firstStartedPulling="2026-02-25 11:10:52.895977066 +0000 UTC m=+1078.394559091" lastFinishedPulling="2026-02-25 11:11:01.603185633 +0000 UTC m=+1087.101767658" observedRunningTime="2026-02-25 11:11:02.865934313 +0000 UTC m=+1088.364516348" watchObservedRunningTime="2026-02-25 11:11:02.867216746 +0000 UTC m=+1088.365798761" Feb 25 11:11:02 crc kubenswrapper[4725]: I0225 11:11:02.976013 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.312087596 podStartE2EDuration="18.975991414s" podCreationTimestamp="2026-02-25 11:10:44 +0000 UTC" firstStartedPulling="2026-02-25 11:10:53.08910177 +0000 UTC m=+1078.587683795" lastFinishedPulling="2026-02-25 11:11:01.753005598 +0000 UTC m=+1087.251587613" observedRunningTime="2026-02-25 11:11:02.964194044 +0000 UTC m=+1088.462776069" watchObservedRunningTime="2026-02-25 11:11:02.975991414 +0000 UTC m=+1088.474573449" Feb 25 11:11:03 crc kubenswrapper[4725]: I0225 11:11:03.237250 4725 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="15d73a7a-5441-4c0b-812e-f57ed4bdd594" path="/var/lib/kubelet/pods/15d73a7a-5441-4c0b-812e-f57ed4bdd594/volumes" Feb 25 11:11:03 crc kubenswrapper[4725]: I0225 11:11:03.844981 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57e7a103-f119-4d8e-bb7f-96f36b66994e","Type":"ContainerStarted","Data":"65bb35575781bad2e98c04d4e1b97efb65e9db76bd69365abd39ee6385396cf2"} Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.857963 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"818d1929-2446-4ce6-80ec-6ed3fdec2b3d","Type":"ContainerStarted","Data":"1000b29b018d8ec8294261cd72227b2b13cd4b94d4ebc1923cbe6bbf14e4987f"} Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.862399 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-drphb" event={"ID":"493d04a9-b969-4c11-bd84-a1e9d57b7772","Type":"ContainerStarted","Data":"009e57450bd775f3a1a26c04be84623f325d1cb3c4941e9b239211775097fd85"} Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.862456 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-drphb" event={"ID":"493d04a9-b969-4c11-bd84-a1e9d57b7772","Type":"ContainerStarted","Data":"f6b0cdef275effd478419e45e64ae2a5aca2b3ecabdaaf3e3fc9aaf2bdda27dd"} Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.862513 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.862547 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.864975 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"24e787b7-ef1d-4c61-b01a-f8119d7911c0","Type":"ContainerStarted","Data":"655a918e32c6c5c9800056fd056601f44b7d874ffa552879b10c9dd605556045"} Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.893430 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.893545 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.905097 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.28217761 podStartE2EDuration="14.905064008s" podCreationTimestamp="2026-02-25 11:10:50 +0000 UTC" firstStartedPulling="2026-02-25 11:10:53.155476493 +0000 UTC m=+1078.654058518" lastFinishedPulling="2026-02-25 11:11:03.778362891 +0000 UTC m=+1089.276944916" observedRunningTime="2026-02-25 11:11:04.887698112 +0000 UTC m=+1090.386280187" watchObservedRunningTime="2026-02-25 11:11:04.905064008 +0000 UTC m=+1090.403646073" Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.925949 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-drphb" podStartSLOduration=10.164551166 podStartE2EDuration="17.925928196s" podCreationTimestamp="2026-02-25 11:10:47 +0000 UTC" firstStartedPulling="2026-02-25 11:10:53.298810669 +0000 UTC m=+1078.797392694" lastFinishedPulling="2026-02-25 11:11:01.060187699 +0000 UTC m=+1086.558769724" observedRunningTime="2026-02-25 11:11:04.917919306 +0000 UTC m=+1090.416501361" watchObservedRunningTime="2026-02-25 11:11:04.925928196 +0000 UTC m=+1090.424510231" Feb 25 11:11:04 crc kubenswrapper[4725]: I0225 11:11:04.950205 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.157844455 podStartE2EDuration="16.950179403s" podCreationTimestamp="2026-02-25 
11:10:48 +0000 UTC" firstStartedPulling="2026-02-25 11:10:52.988334873 +0000 UTC m=+1078.486916888" lastFinishedPulling="2026-02-25 11:11:03.780669801 +0000 UTC m=+1089.279251836" observedRunningTime="2026-02-25 11:11:04.94624896 +0000 UTC m=+1090.444831015" watchObservedRunningTime="2026-02-25 11:11:04.950179403 +0000 UTC m=+1090.448761468" Feb 25 11:11:05 crc kubenswrapper[4725]: I0225 11:11:05.878409 4725 generic.go:334] "Generic (PLEG): container finished" podID="a6c23a18-36cf-4d71-885d-f2b93ba16375" containerID="9257bfa8d559072148c6b97c1ee185bedeb2e2c6ef6ab65b86ba8b8049154aca" exitCode=0 Feb 25 11:11:05 crc kubenswrapper[4725]: I0225 11:11:05.878508 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a6c23a18-36cf-4d71-885d-f2b93ba16375","Type":"ContainerDied","Data":"9257bfa8d559072148c6b97c1ee185bedeb2e2c6ef6ab65b86ba8b8049154aca"} Feb 25 11:11:05 crc kubenswrapper[4725]: I0225 11:11:05.883256 4725 generic.go:334] "Generic (PLEG): container finished" podID="99ef16ee-b18a-4374-9b14-0d6e08df5558" containerID="da587342c08b81ddacff579f4531cd8b4babd91a5834f4c086f8e6a5c37df392" exitCode=0 Feb 25 11:11:05 crc kubenswrapper[4725]: I0225 11:11:05.883337 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"99ef16ee-b18a-4374-9b14-0d6e08df5558","Type":"ContainerDied","Data":"da587342c08b81ddacff579f4531cd8b4babd91a5834f4c086f8e6a5c37df392"} Feb 25 11:11:06 crc kubenswrapper[4725]: I0225 11:11:06.402919 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 25 11:11:06 crc kubenswrapper[4725]: I0225 11:11:06.403184 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 25 11:11:06 crc kubenswrapper[4725]: I0225 11:11:06.452523 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 25 11:11:06 crc 
kubenswrapper[4725]: I0225 11:11:06.898818 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"99ef16ee-b18a-4374-9b14-0d6e08df5558","Type":"ContainerStarted","Data":"0321bfb328147892d7da74e4dbfb6e4a687035182ff4bf04d56617a8c2f88a03"} Feb 25 11:11:06 crc kubenswrapper[4725]: I0225 11:11:06.902596 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a6c23a18-36cf-4d71-885d-f2b93ba16375","Type":"ContainerStarted","Data":"ad1a6c03cadc6bc944999bb142beb1813a5f2e542d0de9ca30d2f9f809cf56e9"} Feb 25 11:11:06 crc kubenswrapper[4725]: I0225 11:11:06.945658 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.09474382 podStartE2EDuration="26.94563779s" podCreationTimestamp="2026-02-25 11:10:40 +0000 UTC" firstStartedPulling="2026-02-25 11:10:52.753989577 +0000 UTC m=+1078.252571602" lastFinishedPulling="2026-02-25 11:11:01.604883547 +0000 UTC m=+1087.103465572" observedRunningTime="2026-02-25 11:11:06.928977252 +0000 UTC m=+1092.427559287" watchObservedRunningTime="2026-02-25 11:11:06.94563779 +0000 UTC m=+1092.444219825" Feb 25 11:11:06 crc kubenswrapper[4725]: I0225 11:11:06.959178 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.81273036 podStartE2EDuration="27.959143775s" podCreationTimestamp="2026-02-25 11:10:39 +0000 UTC" firstStartedPulling="2026-02-25 11:10:52.907211121 +0000 UTC m=+1078.405793146" lastFinishedPulling="2026-02-25 11:11:01.053624536 +0000 UTC m=+1086.552206561" observedRunningTime="2026-02-25 11:11:06.954670657 +0000 UTC m=+1092.453252722" watchObservedRunningTime="2026-02-25 11:11:06.959143775 +0000 UTC m=+1092.457725900" Feb 25 11:11:06 crc kubenswrapper[4725]: I0225 11:11:06.982747 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-sb-0" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.198045 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.294620 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-frqfd"] Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.295763 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.298738 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.308396 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-frqfd"] Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.323916 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hlw77"] Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.324769 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.327069 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.351407 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hlw77"] Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372482 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82a07d0a-26d5-463c-95aa-eb022c49ac9d-ovn-rundir\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372527 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a07d0a-26d5-463c-95aa-eb022c49ac9d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372551 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372593 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82a07d0a-26d5-463c-95aa-eb022c49ac9d-ovs-rundir\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " 
pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372613 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl78c\" (UniqueName: \"kubernetes.io/projected/82a07d0a-26d5-463c-95aa-eb022c49ac9d-kube-api-access-kl78c\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372644 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372664 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a07d0a-26d5-463c-95aa-eb022c49ac9d-combined-ca-bundle\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372692 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-config\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372728 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a07d0a-26d5-463c-95aa-eb022c49ac9d-config\") pod \"ovn-controller-metrics-hlw77\" (UID: 
\"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.372754 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9pb\" (UniqueName: \"kubernetes.io/projected/0f5cae3b-2eee-40e2-8155-3eb4218417ee-kube-api-access-bz9pb\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474218 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82a07d0a-26d5-463c-95aa-eb022c49ac9d-ovn-rundir\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474489 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a07d0a-26d5-463c-95aa-eb022c49ac9d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474512 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474553 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82a07d0a-26d5-463c-95aa-eb022c49ac9d-ovs-rundir\") pod \"ovn-controller-metrics-hlw77\" (UID: 
\"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474572 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl78c\" (UniqueName: \"kubernetes.io/projected/82a07d0a-26d5-463c-95aa-eb022c49ac9d-kube-api-access-kl78c\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474601 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474622 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a07d0a-26d5-463c-95aa-eb022c49ac9d-combined-ca-bundle\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474648 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-config\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474687 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a07d0a-26d5-463c-95aa-eb022c49ac9d-config\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " 
pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.474707 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz9pb\" (UniqueName: \"kubernetes.io/projected/0f5cae3b-2eee-40e2-8155-3eb4218417ee-kube-api-access-bz9pb\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.475193 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82a07d0a-26d5-463c-95aa-eb022c49ac9d-ovn-rundir\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.476453 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-config\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.476516 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.477080 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.477144 
4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82a07d0a-26d5-463c-95aa-eb022c49ac9d-ovs-rundir\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.477855 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a07d0a-26d5-463c-95aa-eb022c49ac9d-config\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.481312 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a07d0a-26d5-463c-95aa-eb022c49ac9d-combined-ca-bundle\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.482152 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a07d0a-26d5-463c-95aa-eb022c49ac9d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.501490 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz9pb\" (UniqueName: \"kubernetes.io/projected/0f5cae3b-2eee-40e2-8155-3eb4218417ee-kube-api-access-bz9pb\") pod \"dnsmasq-dns-7f896c8c65-frqfd\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.521432 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kl78c\" (UniqueName: \"kubernetes.io/projected/82a07d0a-26d5-463c-95aa-eb022c49ac9d-kube-api-access-kl78c\") pod \"ovn-controller-metrics-hlw77\" (UID: \"82a07d0a-26d5-463c-95aa-eb022c49ac9d\") " pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.611683 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.640801 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hlw77" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.786567 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-frqfd"] Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.821604 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4gcr6"] Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.822717 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.827499 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.831682 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4gcr6"] Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.902983 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhkg\" (UniqueName: \"kubernetes.io/projected/163a1e93-3ba4-4a36-b01a-c3045f3d1311-kube-api-access-6jhkg\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.903052 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.903069 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-config\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.903128 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.903148 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.937610 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 25 11:11:07 crc kubenswrapper[4725]: I0225 11:11:07.982448 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.006424 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.007902 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.008006 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 
11:11:08.008131 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhkg\" (UniqueName: \"kubernetes.io/projected/163a1e93-3ba4-4a36-b01a-c3045f3d1311-kube-api-access-6jhkg\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.008281 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.008308 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-config\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.010275 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.010527 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-config\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.011287 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.027631 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jhkg\" (UniqueName: \"kubernetes.io/projected/163a1e93-3ba4-4a36-b01a-c3045f3d1311-kube-api-access-6jhkg\") pod \"dnsmasq-dns-86db49b7ff-4gcr6\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.147069 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-frqfd"] Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.157983 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.216743 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hlw77"] Feb 25 11:11:08 crc kubenswrapper[4725]: W0225 11:11:08.236431 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82a07d0a_26d5_463c_95aa_eb022c49ac9d.slice/crio-6e82dab4ecd03dd4f7a834486d1e6ac008694e1519b901f477adde0578a15a9f WatchSource:0}: Error finding container 6e82dab4ecd03dd4f7a834486d1e6ac008694e1519b901f477adde0578a15a9f: Status 404 returned error can't find the container with id 6e82dab4ecd03dd4f7a834486d1e6ac008694e1519b901f477adde0578a15a9f Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.286118 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.299306 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.304051 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.304709 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.304972 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.305113 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.305289 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6p2df" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.414116 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblx9\" (UniqueName: \"kubernetes.io/projected/254996fe-9d34-46de-8e63-d4762c639a24-kube-api-access-lblx9\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.417109 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/254996fe-9d34-46de-8e63-d4762c639a24-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.417168 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " 
pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.417231 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/254996fe-9d34-46de-8e63-d4762c639a24-scripts\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.417320 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.417350 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.417413 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254996fe-9d34-46de-8e63-d4762c639a24-config\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.519615 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254996fe-9d34-46de-8e63-d4762c639a24-config\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.519735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lblx9\" (UniqueName: \"kubernetes.io/projected/254996fe-9d34-46de-8e63-d4762c639a24-kube-api-access-lblx9\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.519771 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/254996fe-9d34-46de-8e63-d4762c639a24-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.519800 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.519852 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/254996fe-9d34-46de-8e63-d4762c639a24-scripts\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.519903 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.519927 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") 
" pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.520332 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/254996fe-9d34-46de-8e63-d4762c639a24-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.520710 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/254996fe-9d34-46de-8e63-d4762c639a24-scripts\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.521564 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254996fe-9d34-46de-8e63-d4762c639a24-config\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.524587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.525343 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.526377 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/254996fe-9d34-46de-8e63-d4762c639a24-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.540742 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lblx9\" (UniqueName: \"kubernetes.io/projected/254996fe-9d34-46de-8e63-d4762c639a24-kube-api-access-lblx9\") pod \"ovn-northd-0\" (UID: \"254996fe-9d34-46de-8e63-d4762c639a24\") " pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.626934 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.645142 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4gcr6"] Feb 25 11:11:08 crc kubenswrapper[4725]: W0225 11:11:08.655854 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod163a1e93_3ba4_4a36_b01a_c3045f3d1311.slice/crio-bfa8d4e8c9149e2ae314982b47b0b5969883fb17d1d7240a8023da342e49cd6e WatchSource:0}: Error finding container bfa8d4e8c9149e2ae314982b47b0b5969883fb17d1d7240a8023da342e49cd6e: Status 404 returned error can't find the container with id bfa8d4e8c9149e2ae314982b47b0b5969883fb17d1d7240a8023da342e49cd6e Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.925063 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hlw77" event={"ID":"82a07d0a-26d5-463c-95aa-eb022c49ac9d","Type":"ContainerStarted","Data":"260af136368393cdce5bbc2f1ea8a712683ad8ddeba9c536eb6a02a94629bc10"} Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.925107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hlw77" 
event={"ID":"82a07d0a-26d5-463c-95aa-eb022c49ac9d","Type":"ContainerStarted","Data":"6e82dab4ecd03dd4f7a834486d1e6ac008694e1519b901f477adde0578a15a9f"} Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.927222 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" event={"ID":"163a1e93-3ba4-4a36-b01a-c3045f3d1311","Type":"ContainerStarted","Data":"2c4bdd9972c34c93975e2b063e82c26a16e4d85533ec635717808efc939426ec"} Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.927275 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" event={"ID":"163a1e93-3ba4-4a36-b01a-c3045f3d1311","Type":"ContainerStarted","Data":"bfa8d4e8c9149e2ae314982b47b0b5969883fb17d1d7240a8023da342e49cd6e"} Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.930140 4725 generic.go:334] "Generic (PLEG): container finished" podID="0f5cae3b-2eee-40e2-8155-3eb4218417ee" containerID="84933ff6b59535d5cb2dcb95e89dd830ad707b64647cc45dd71406b7329f249a" exitCode=0 Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.930176 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" event={"ID":"0f5cae3b-2eee-40e2-8155-3eb4218417ee","Type":"ContainerDied","Data":"84933ff6b59535d5cb2dcb95e89dd830ad707b64647cc45dd71406b7329f249a"} Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.930209 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" event={"ID":"0f5cae3b-2eee-40e2-8155-3eb4218417ee","Type":"ContainerStarted","Data":"c479cdea2074ad1ff9c4f9fad051bbc5068c282ad20098992d75df4dde4b9e9e"} Feb 25 11:11:08 crc kubenswrapper[4725]: I0225 11:11:08.943753 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hlw77" podStartSLOduration=1.943739478 podStartE2EDuration="1.943739478s" podCreationTimestamp="2026-02-25 11:11:07 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:08.942737151 +0000 UTC m=+1094.441319186" watchObservedRunningTime="2026-02-25 11:11:08.943739478 +0000 UTC m=+1094.442321503" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.049788 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 25 11:11:09 crc kubenswrapper[4725]: E0225 11:11:09.050005 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod163a1e93_3ba4_4a36_b01a_c3045f3d1311.slice/crio-2c4bdd9972c34c93975e2b063e82c26a16e4d85533ec635717808efc939426ec.scope\": RecentStats: unable to find data in memory cache]" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.196152 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.234595 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-config\") pod \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.234957 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz9pb\" (UniqueName: \"kubernetes.io/projected/0f5cae3b-2eee-40e2-8155-3eb4218417ee-kube-api-access-bz9pb\") pod \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.234994 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-dns-svc\") pod \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\" (UID: 
\"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.235025 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-ovsdbserver-sb\") pod \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\" (UID: \"0f5cae3b-2eee-40e2-8155-3eb4218417ee\") " Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.239011 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5cae3b-2eee-40e2-8155-3eb4218417ee-kube-api-access-bz9pb" (OuterVolumeSpecName: "kube-api-access-bz9pb") pod "0f5cae3b-2eee-40e2-8155-3eb4218417ee" (UID: "0f5cae3b-2eee-40e2-8155-3eb4218417ee"). InnerVolumeSpecName "kube-api-access-bz9pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.258539 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f5cae3b-2eee-40e2-8155-3eb4218417ee" (UID: "0f5cae3b-2eee-40e2-8155-3eb4218417ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.258549 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-config" (OuterVolumeSpecName: "config") pod "0f5cae3b-2eee-40e2-8155-3eb4218417ee" (UID: "0f5cae3b-2eee-40e2-8155-3eb4218417ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.259351 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f5cae3b-2eee-40e2-8155-3eb4218417ee" (UID: "0f5cae3b-2eee-40e2-8155-3eb4218417ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.336525 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.336552 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz9pb\" (UniqueName: \"kubernetes.io/projected/0f5cae3b-2eee-40e2-8155-3eb4218417ee-kube-api-access-bz9pb\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.336562 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.336570 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5cae3b-2eee-40e2-8155-3eb4218417ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.939440 4725 generic.go:334] "Generic (PLEG): container finished" podID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerID="2c4bdd9972c34c93975e2b063e82c26a16e4d85533ec635717808efc939426ec" exitCode=0 Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.939551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" 
event={"ID":"163a1e93-3ba4-4a36-b01a-c3045f3d1311","Type":"ContainerDied","Data":"2c4bdd9972c34c93975e2b063e82c26a16e4d85533ec635717808efc939426ec"} Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.946692 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" event={"ID":"0f5cae3b-2eee-40e2-8155-3eb4218417ee","Type":"ContainerDied","Data":"c479cdea2074ad1ff9c4f9fad051bbc5068c282ad20098992d75df4dde4b9e9e"} Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.946747 4725 scope.go:117] "RemoveContainer" containerID="84933ff6b59535d5cb2dcb95e89dd830ad707b64647cc45dd71406b7329f249a" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.947062 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-frqfd" Feb 25 11:11:09 crc kubenswrapper[4725]: I0225 11:11:09.948692 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"254996fe-9d34-46de-8e63-d4762c639a24","Type":"ContainerStarted","Data":"5dd4bce3088898f811d59cbd857158e8fcac713f1c6d53805224ea33ca742499"} Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.146877 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-frqfd"] Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.152429 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-frqfd"] Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.643478 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.643515 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.729898 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 25 11:11:10 crc 
kubenswrapper[4725]: I0225 11:11:10.957019 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" event={"ID":"163a1e93-3ba4-4a36-b01a-c3045f3d1311","Type":"ContainerStarted","Data":"8ceec7ced06476e334fd1c2fe2c8b571a54c9966a7f60b27504a8f657d13edbd"} Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.957380 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.959988 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"254996fe-9d34-46de-8e63-d4762c639a24","Type":"ContainerStarted","Data":"75aad163c33ed34b8cd329a3b03c136070bbfc2af5785687a940459d71f60dbc"} Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.960016 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"254996fe-9d34-46de-8e63-d4762c639a24","Type":"ContainerStarted","Data":"8fe5ee78c3ecaf526f7f4ae6aab344a625c66688123e993e9fc5ca2a1b487ef8"} Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.960222 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 25 11:11:10 crc kubenswrapper[4725]: I0225 11:11:10.978050 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" podStartSLOduration=3.978032415 podStartE2EDuration="3.978032415s" podCreationTimestamp="2026-02-25 11:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:10.973728842 +0000 UTC m=+1096.472310877" watchObservedRunningTime="2026-02-25 11:11:10.978032415 +0000 UTC m=+1096.476614440" Feb 25 11:11:11 crc kubenswrapper[4725]: I0225 11:11:11.001797 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.04091929 
podStartE2EDuration="3.001778939s" podCreationTimestamp="2026-02-25 11:11:08 +0000 UTC" firstStartedPulling="2026-02-25 11:11:09.057186618 +0000 UTC m=+1094.555768643" lastFinishedPulling="2026-02-25 11:11:10.018046267 +0000 UTC m=+1095.516628292" observedRunningTime="2026-02-25 11:11:10.998962155 +0000 UTC m=+1096.497544190" watchObservedRunningTime="2026-02-25 11:11:11.001778939 +0000 UTC m=+1096.500360964" Feb 25 11:11:11 crc kubenswrapper[4725]: I0225 11:11:11.050202 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 25 11:11:11 crc kubenswrapper[4725]: I0225 11:11:11.239332 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5cae3b-2eee-40e2-8155-3eb4218417ee" path="/var/lib/kubelet/pods/0f5cae3b-2eee-40e2-8155-3eb4218417ee/volumes" Feb 25 11:11:11 crc kubenswrapper[4725]: I0225 11:11:11.898402 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 25 11:11:11 crc kubenswrapper[4725]: I0225 11:11:11.898499 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.248919 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4cc8-account-create-update-gxqmd"] Feb 25 11:11:13 crc kubenswrapper[4725]: E0225 11:11:13.249493 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5cae3b-2eee-40e2-8155-3eb4218417ee" containerName="init" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.249505 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5cae3b-2eee-40e2-8155-3eb4218417ee" containerName="init" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.249646 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f5cae3b-2eee-40e2-8155-3eb4218417ee" containerName="init" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.250171 4725 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.256998 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.264534 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4cc8-account-create-update-gxqmd"] Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.302362 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-kzkj5"] Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.303476 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.307029 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8094412c-eb55-4366-a7a2-0bd29cff2983-operator-scripts\") pod \"keystone-4cc8-account-create-update-gxqmd\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") " pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.307127 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqp6p\" (UniqueName: \"kubernetes.io/projected/8094412c-eb55-4366-a7a2-0bd29cff2983-kube-api-access-gqp6p\") pod \"keystone-4cc8-account-create-update-gxqmd\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") " pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.308467 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kzkj5"] Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.398316 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-create-bb4jr"] Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.399701 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.408890 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ppzw\" (UniqueName: \"kubernetes.io/projected/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-kube-api-access-7ppzw\") pod \"keystone-db-create-kzkj5\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") " pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.409122 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-operator-scripts\") pod \"keystone-db-create-kzkj5\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") " pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.409223 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8094412c-eb55-4366-a7a2-0bd29cff2983-operator-scripts\") pod \"keystone-4cc8-account-create-update-gxqmd\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") " pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.409345 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2ef8b2-0d39-411c-b91a-a396aa246f66-operator-scripts\") pod \"placement-db-create-bb4jr\" (UID: \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") " pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.409437 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbltm\" (UniqueName: \"kubernetes.io/projected/ad2ef8b2-0d39-411c-b91a-a396aa246f66-kube-api-access-xbltm\") pod \"placement-db-create-bb4jr\" (UID: \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") " pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.409543 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqp6p\" (UniqueName: \"kubernetes.io/projected/8094412c-eb55-4366-a7a2-0bd29cff2983-kube-api-access-gqp6p\") pod \"keystone-4cc8-account-create-update-gxqmd\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") " pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.410120 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8094412c-eb55-4366-a7a2-0bd29cff2983-operator-scripts\") pod \"keystone-4cc8-account-create-update-gxqmd\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") " pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.410884 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bb4jr"] Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.441326 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqp6p\" (UniqueName: \"kubernetes.io/projected/8094412c-eb55-4366-a7a2-0bd29cff2983-kube-api-access-gqp6p\") pod \"keystone-4cc8-account-create-update-gxqmd\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") " pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.468413 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fdca-account-create-update-998xh"] Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.470000 4725 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.471765 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.481876 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fdca-account-create-update-998xh"] Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.513373 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2ef8b2-0d39-411c-b91a-a396aa246f66-operator-scripts\") pod \"placement-db-create-bb4jr\" (UID: \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") " pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.513436 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbltm\" (UniqueName: \"kubernetes.io/projected/ad2ef8b2-0d39-411c-b91a-a396aa246f66-kube-api-access-xbltm\") pod \"placement-db-create-bb4jr\" (UID: \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") " pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.513478 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5586\" (UniqueName: \"kubernetes.io/projected/422d7ab0-0190-46dc-976e-e827bb7b48e8-kube-api-access-x5586\") pod \"placement-fdca-account-create-update-998xh\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") " pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.513566 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/422d7ab0-0190-46dc-976e-e827bb7b48e8-operator-scripts\") pod 
\"placement-fdca-account-create-update-998xh\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") " pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.513612 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ppzw\" (UniqueName: \"kubernetes.io/projected/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-kube-api-access-7ppzw\") pod \"keystone-db-create-kzkj5\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") " pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.513650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-operator-scripts\") pod \"keystone-db-create-kzkj5\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") " pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.514387 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-operator-scripts\") pod \"keystone-db-create-kzkj5\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") " pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.514909 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2ef8b2-0d39-411c-b91a-a396aa246f66-operator-scripts\") pod \"placement-db-create-bb4jr\" (UID: \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") " pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.533273 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbltm\" (UniqueName: \"kubernetes.io/projected/ad2ef8b2-0d39-411c-b91a-a396aa246f66-kube-api-access-xbltm\") pod \"placement-db-create-bb4jr\" (UID: 
\"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") " pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.537328 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ppzw\" (UniqueName: \"kubernetes.io/projected/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-kube-api-access-7ppzw\") pod \"keystone-db-create-kzkj5\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") " pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.571363 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.615155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5586\" (UniqueName: \"kubernetes.io/projected/422d7ab0-0190-46dc-976e-e827bb7b48e8-kube-api-access-x5586\") pod \"placement-fdca-account-create-update-998xh\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") " pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.615230 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/422d7ab0-0190-46dc-976e-e827bb7b48e8-operator-scripts\") pod \"placement-fdca-account-create-update-998xh\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") " pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.615956 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/422d7ab0-0190-46dc-976e-e827bb7b48e8-operator-scripts\") pod \"placement-fdca-account-create-update-998xh\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") " pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.622989 
4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kzkj5" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.639813 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5586\" (UniqueName: \"kubernetes.io/projected/422d7ab0-0190-46dc-976e-e827bb7b48e8-kube-api-access-x5586\") pod \"placement-fdca-account-create-update-998xh\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") " pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.732644 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bb4jr" Feb 25 11:11:13 crc kubenswrapper[4725]: I0225 11:11:13.793916 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdca-account-create-update-998xh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.013880 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4cc8-account-create-update-gxqmd"] Feb 25 11:11:14 crc kubenswrapper[4725]: W0225 11:11:14.017495 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8094412c_eb55_4366_a7a2_0bd29cff2983.slice/crio-2bca8d5148d75e3894cb9d00bba51ecd5751038adbf410cba0c2213853fe63f8 WatchSource:0}: Error finding container 2bca8d5148d75e3894cb9d00bba51ecd5751038adbf410cba0c2213853fe63f8: Status 404 returned error can't find the container with id 2bca8d5148d75e3894cb9d00bba51ecd5751038adbf410cba0c2213853fe63f8 Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.146206 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kzkj5"] Feb 25 11:11:14 crc kubenswrapper[4725]: W0225 11:11:14.146627 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc03bd2e_9d03_4ff9_ba01_c24bd7c00b09.slice/crio-7afcf9bf09a30ca262d30af044b31fe6381665ef566d1ec185af3104500cead9 WatchSource:0}: Error finding container 7afcf9bf09a30ca262d30af044b31fe6381665ef566d1ec185af3104500cead9: Status 404 returned error can't find the container with id 7afcf9bf09a30ca262d30af044b31fe6381665ef566d1ec185af3104500cead9 Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.247551 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bb4jr"] Feb 25 11:11:14 crc kubenswrapper[4725]: W0225 11:11:14.252810 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad2ef8b2_0d39_411c_b91a_a396aa246f66.slice/crio-e9f278e086e5bd35614a484e84331dbae5e95b24a6fa33cf1854e2b3ede4466b WatchSource:0}: Error finding container e9f278e086e5bd35614a484e84331dbae5e95b24a6fa33cf1854e2b3ede4466b: Status 404 returned error can't find the container with id e9f278e086e5bd35614a484e84331dbae5e95b24a6fa33cf1854e2b3ede4466b Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.263130 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.320102 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fdca-account-create-update-998xh"] Feb 25 11:11:14 crc kubenswrapper[4725]: W0225 11:11:14.333577 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod422d7ab0_0190_46dc_976e_e827bb7b48e8.slice/crio-9f2c6fa465b059ce9f1437d2b9b3b40585cb7823f8fe19510f4b9c82bbc4e58e WatchSource:0}: Error finding container 9f2c6fa465b059ce9f1437d2b9b3b40585cb7823f8fe19510f4b9c82bbc4e58e: Status 404 returned error can't find the container with id 
9f2c6fa465b059ce9f1437d2b9b3b40585cb7823f8fe19510f4b9c82bbc4e58e Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.368857 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.524997 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.571175 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4gcr6"] Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.576210 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" podUID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerName="dnsmasq-dns" containerID="cri-o://8ceec7ced06476e334fd1c2fe2c8b571a54c9966a7f60b27504a8f657d13edbd" gracePeriod=10 Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.604420 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-7frxh"] Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.614374 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.621005 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7frxh"] Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.632551 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.632646 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-config\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.632684 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.632710 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2zlv\" (UniqueName: \"kubernetes.io/projected/9c296aab-4223-43bb-a032-45b20ffeaab5-kube-api-access-v2zlv\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.632747 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-dns-svc\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.733793 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-dns-svc\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.733891 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.734001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-config\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.734037 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.734071 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zlv\" 
(UniqueName: \"kubernetes.io/projected/9c296aab-4223-43bb-a032-45b20ffeaab5-kube-api-access-v2zlv\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.735013 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-dns-svc\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.735172 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-config\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.735393 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.735672 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.775748 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2zlv\" (UniqueName: \"kubernetes.io/projected/9c296aab-4223-43bb-a032-45b20ffeaab5-kube-api-access-v2zlv\") pod 
\"dnsmasq-dns-698758b865-7frxh\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:14 crc kubenswrapper[4725]: I0225 11:11:14.942117 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.007232 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4cc8-account-create-update-gxqmd" event={"ID":"8094412c-eb55-4366-a7a2-0bd29cff2983","Type":"ContainerStarted","Data":"6a05cc28b3b8bceca89cd70d0309955773352e37a6023bfc9dabdc1113178faf"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.007432 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4cc8-account-create-update-gxqmd" event={"ID":"8094412c-eb55-4366-a7a2-0bd29cff2983","Type":"ContainerStarted","Data":"2bca8d5148d75e3894cb9d00bba51ecd5751038adbf410cba0c2213853fe63f8"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.009452 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdca-account-create-update-998xh" event={"ID":"422d7ab0-0190-46dc-976e-e827bb7b48e8","Type":"ContainerStarted","Data":"619a4c29f7698049b8e454a562d218ae23d6b3401c83a252ce8839e13d98d54d"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.009500 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdca-account-create-update-998xh" event={"ID":"422d7ab0-0190-46dc-976e-e827bb7b48e8","Type":"ContainerStarted","Data":"9f2c6fa465b059ce9f1437d2b9b3b40585cb7823f8fe19510f4b9c82bbc4e58e"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.018910 4725 generic.go:334] "Generic (PLEG): container finished" podID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerID="8ceec7ced06476e334fd1c2fe2c8b571a54c9966a7f60b27504a8f657d13edbd" exitCode=0 Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.018986 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" event={"ID":"163a1e93-3ba4-4a36-b01a-c3045f3d1311","Type":"ContainerDied","Data":"8ceec7ced06476e334fd1c2fe2c8b571a54c9966a7f60b27504a8f657d13edbd"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.020582 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bb4jr" event={"ID":"ad2ef8b2-0d39-411c-b91a-a396aa246f66","Type":"ContainerStarted","Data":"5654d16a0de2872aaf5c7f754bc590b5077a16b7fbfd4f83af50aabaad603412"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.020606 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bb4jr" event={"ID":"ad2ef8b2-0d39-411c-b91a-a396aa246f66","Type":"ContainerStarted","Data":"e9f278e086e5bd35614a484e84331dbae5e95b24a6fa33cf1854e2b3ede4466b"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.027057 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kzkj5" event={"ID":"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09","Type":"ContainerStarted","Data":"3d3e86dc4494d1dc674a9e4d1bcf89efcc35c66a729e039eb01e34bf77a14955"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.027101 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kzkj5" event={"ID":"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09","Type":"ContainerStarted","Data":"7afcf9bf09a30ca262d30af044b31fe6381665ef566d1ec185af3104500cead9"} Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.037500 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-4cc8-account-create-update-gxqmd" podStartSLOduration=2.037478358 podStartE2EDuration="2.037478358s" podCreationTimestamp="2026-02-25 11:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:15.036059191 +0000 UTC m=+1100.534641216" watchObservedRunningTime="2026-02-25 11:11:15.037478358 +0000 
UTC m=+1100.536060403" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.055673 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-bb4jr" podStartSLOduration=2.055654229 podStartE2EDuration="2.055654229s" podCreationTimestamp="2026-02-25 11:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:15.050329388 +0000 UTC m=+1100.548911433" watchObservedRunningTime="2026-02-25 11:11:15.055654229 +0000 UTC m=+1100.554236254" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.069219 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-kzkj5" podStartSLOduration=2.069204947 podStartE2EDuration="2.069204947s" podCreationTimestamp="2026-02-25 11:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:15.067464281 +0000 UTC m=+1100.566046306" watchObservedRunningTime="2026-02-25 11:11:15.069204947 +0000 UTC m=+1100.567786962" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.121760 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.147404 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fdca-account-create-update-998xh" podStartSLOduration=2.147382615 podStartE2EDuration="2.147382615s" podCreationTimestamp="2026-02-25 11:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:15.086947027 +0000 UTC m=+1100.585529062" watchObservedRunningTime="2026-02-25 11:11:15.147382615 +0000 UTC m=+1100.645964640" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.249327 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-config\") pod \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.249473 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-nb\") pod \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.249564 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jhkg\" (UniqueName: \"kubernetes.io/projected/163a1e93-3ba4-4a36-b01a-c3045f3d1311-kube-api-access-6jhkg\") pod \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.249588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-sb\") pod 
\"163a1e93-3ba4-4a36-b01a-c3045f3d1311\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.249633 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-dns-svc\") pod \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\" (UID: \"163a1e93-3ba4-4a36-b01a-c3045f3d1311\") " Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.253454 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/163a1e93-3ba4-4a36-b01a-c3045f3d1311-kube-api-access-6jhkg" (OuterVolumeSpecName: "kube-api-access-6jhkg") pod "163a1e93-3ba4-4a36-b01a-c3045f3d1311" (UID: "163a1e93-3ba4-4a36-b01a-c3045f3d1311"). InnerVolumeSpecName "kube-api-access-6jhkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.284659 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "163a1e93-3ba4-4a36-b01a-c3045f3d1311" (UID: "163a1e93-3ba4-4a36-b01a-c3045f3d1311"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.285847 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-config" (OuterVolumeSpecName: "config") pod "163a1e93-3ba4-4a36-b01a-c3045f3d1311" (UID: "163a1e93-3ba4-4a36-b01a-c3045f3d1311"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.285890 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "163a1e93-3ba4-4a36-b01a-c3045f3d1311" (UID: "163a1e93-3ba4-4a36-b01a-c3045f3d1311"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.286771 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "163a1e93-3ba4-4a36-b01a-c3045f3d1311" (UID: "163a1e93-3ba4-4a36-b01a-c3045f3d1311"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.351527 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jhkg\" (UniqueName: \"kubernetes.io/projected/163a1e93-3ba4-4a36-b01a-c3045f3d1311-kube-api-access-6jhkg\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.351562 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.351571 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.351579 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:15 crc 
kubenswrapper[4725]: I0225 11:11:15.351590 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163a1e93-3ba4-4a36-b01a-c3045f3d1311-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.416504 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7frxh"] Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.787673 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 25 11:11:15 crc kubenswrapper[4725]: E0225 11:11:15.789141 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerName="init" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.789177 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerName="init" Feb 25 11:11:15 crc kubenswrapper[4725]: E0225 11:11:15.789224 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerName="dnsmasq-dns" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.789238 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerName="dnsmasq-dns" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.790081 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" containerName="dnsmasq-dns" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.824520 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.827741 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2wpxf" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.827923 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.828039 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.827952 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.829355 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.962040 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcfrm\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-kube-api-access-rcfrm\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.962199 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d922deba-d455-45a7-ade3-dc2f588617bc-lock\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.962320 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d922deba-d455-45a7-ade3-dc2f588617bc-cache\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 
25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.962422 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d922deba-d455-45a7-ade3-dc2f588617bc-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.962764 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:15 crc kubenswrapper[4725]: I0225 11:11:15.963001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.038255 4725 generic.go:334] "Generic (PLEG): container finished" podID="bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09" containerID="3d3e86dc4494d1dc674a9e4d1bcf89efcc35c66a729e039eb01e34bf77a14955" exitCode=0 Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.038682 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kzkj5" event={"ID":"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09","Type":"ContainerDied","Data":"3d3e86dc4494d1dc674a9e4d1bcf89efcc35c66a729e039eb01e34bf77a14955"} Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.042663 4725 generic.go:334] "Generic (PLEG): container finished" podID="422d7ab0-0190-46dc-976e-e827bb7b48e8" containerID="619a4c29f7698049b8e454a562d218ae23d6b3401c83a252ce8839e13d98d54d" exitCode=0 Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.042755 
4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdca-account-create-update-998xh" event={"ID":"422d7ab0-0190-46dc-976e-e827bb7b48e8","Type":"ContainerDied","Data":"619a4c29f7698049b8e454a562d218ae23d6b3401c83a252ce8839e13d98d54d"} Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.051126 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.050981 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4gcr6" event={"ID":"163a1e93-3ba4-4a36-b01a-c3045f3d1311","Type":"ContainerDied","Data":"bfa8d4e8c9149e2ae314982b47b0b5969883fb17d1d7240a8023da342e49cd6e"} Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.052088 4725 scope.go:117] "RemoveContainer" containerID="8ceec7ced06476e334fd1c2fe2c8b571a54c9966a7f60b27504a8f657d13edbd" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.061596 4725 generic.go:334] "Generic (PLEG): container finished" podID="ad2ef8b2-0d39-411c-b91a-a396aa246f66" containerID="5654d16a0de2872aaf5c7f754bc590b5077a16b7fbfd4f83af50aabaad603412" exitCode=0 Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.061699 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bb4jr" event={"ID":"ad2ef8b2-0d39-411c-b91a-a396aa246f66","Type":"ContainerDied","Data":"5654d16a0de2872aaf5c7f754bc590b5077a16b7fbfd4f83af50aabaad603412"} Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.064329 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.064433 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.064524 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcfrm\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-kube-api-access-rcfrm\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: E0225 11:11:16.064610 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 11:11:16 crc kubenswrapper[4725]: E0225 11:11:16.064661 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 11:11:16 crc kubenswrapper[4725]: E0225 11:11:16.064740 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift podName:d922deba-d455-45a7-ade3-dc2f588617bc nodeName:}" failed. No retries permitted until 2026-02-25 11:11:16.564712343 +0000 UTC m=+1102.063294408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift") pod "swift-storage-0" (UID: "d922deba-d455-45a7-ade3-dc2f588617bc") : configmap "swift-ring-files" not found Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.064635 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d922deba-d455-45a7-ade3-dc2f588617bc-lock\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.065239 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d922deba-d455-45a7-ade3-dc2f588617bc-cache\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.065390 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d922deba-d455-45a7-ade3-dc2f588617bc-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.065469 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d922deba-d455-45a7-ade3-dc2f588617bc-lock\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.065889 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") device mount path \"/mnt/openstack/pv08\"" 
pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.066271 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d922deba-d455-45a7-ade3-dc2f588617bc-cache\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.067493 4725 generic.go:334] "Generic (PLEG): container finished" podID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerID="2af8e6e7a562b7ee4af79f43dae1477fc3030c2af6ebca94387b740f4bd7db9a" exitCode=0 Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.067583 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7frxh" event={"ID":"9c296aab-4223-43bb-a032-45b20ffeaab5","Type":"ContainerDied","Data":"2af8e6e7a562b7ee4af79f43dae1477fc3030c2af6ebca94387b740f4bd7db9a"} Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.067612 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7frxh" event={"ID":"9c296aab-4223-43bb-a032-45b20ffeaab5","Type":"ContainerStarted","Data":"9675fdc0601c6c560e3010d592a48d6d395e495f27c1ebf599497090ecff60de"} Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.073052 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4cc8-account-create-update-gxqmd" event={"ID":"8094412c-eb55-4366-a7a2-0bd29cff2983","Type":"ContainerDied","Data":"6a05cc28b3b8bceca89cd70d0309955773352e37a6023bfc9dabdc1113178faf"} Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.073049 4725 generic.go:334] "Generic (PLEG): container finished" podID="8094412c-eb55-4366-a7a2-0bd29cff2983" containerID="6a05cc28b3b8bceca89cd70d0309955773352e37a6023bfc9dabdc1113178faf" exitCode=0 Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.076158 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d922deba-d455-45a7-ade3-dc2f588617bc-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.098035 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcfrm\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-kube-api-access-rcfrm\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.119349 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.167341 4725 scope.go:117] "RemoveContainer" containerID="2c4bdd9972c34c93975e2b063e82c26a16e4d85533ec635717808efc939426ec" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.181657 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4gcr6"] Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.189944 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4gcr6"] Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.276649 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mzr6j"] Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.277615 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.280001 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.280174 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.280544 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.297266 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mzr6j"] Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.328334 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-mzr6j"] Feb 25 11:11:16 crc kubenswrapper[4725]: E0225 11:11:16.329175 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-rcvjj ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-rcvjj ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-mzr6j" podUID="a45604ea-1cc6-41ef-a08a-5f535a8c79df" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.336205 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-zc6sk"] Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.337466 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.354986 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zc6sk"] Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.478973 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-swiftconf\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.479032 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-scripts\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.479062 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-ring-data-devices\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.479095 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-swiftconf\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.479121 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-combined-ca-bundle\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.479741 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-dispersionconf\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.479962 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-dispersionconf\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.480158 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zfm8\" (UniqueName: \"kubernetes.io/projected/c5574881-8546-456a-96b2-d58158e8a447-kube-api-access-5zfm8\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.480193 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-scripts\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.480275 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5574881-8546-456a-96b2-d58158e8a447-etc-swift\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.480316 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-ring-data-devices\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.480342 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvjj\" (UniqueName: \"kubernetes.io/projected/a45604ea-1cc6-41ef-a08a-5f535a8c79df-kube-api-access-rcvjj\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.480366 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-combined-ca-bundle\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.480476 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a45604ea-1cc6-41ef-a08a-5f535a8c79df-etc-swift\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582049 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-combined-ca-bundle\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582108 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-dispersionconf\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582156 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-dispersionconf\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582221 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582272 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zfm8\" (UniqueName: \"kubernetes.io/projected/c5574881-8546-456a-96b2-d58158e8a447-kube-api-access-5zfm8\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582296 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-scripts\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582334 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5574881-8546-456a-96b2-d58158e8a447-etc-swift\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582367 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-ring-data-devices\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582389 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvjj\" (UniqueName: \"kubernetes.io/projected/a45604ea-1cc6-41ef-a08a-5f535a8c79df-kube-api-access-rcvjj\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582417 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-combined-ca-bundle\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/a45604ea-1cc6-41ef-a08a-5f535a8c79df-etc-swift\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582497 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-swiftconf\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582530 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-scripts\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-ring-data-devices\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.582582 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-swiftconf\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.583342 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5574881-8546-456a-96b2-d58158e8a447-etc-swift\") pod \"swift-ring-rebalance-zc6sk\" (UID: 
\"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.583442 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-scripts\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.583510 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-ring-data-devices\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.583686 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a45604ea-1cc6-41ef-a08a-5f535a8c79df-etc-swift\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: E0225 11:11:16.583882 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 11:11:16 crc kubenswrapper[4725]: E0225 11:11:16.583900 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 11:11:16 crc kubenswrapper[4725]: E0225 11:11:16.583943 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift podName:d922deba-d455-45a7-ade3-dc2f588617bc nodeName:}" failed. No retries permitted until 2026-02-25 11:11:17.583925562 +0000 UTC m=+1103.082507687 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift") pod "swift-storage-0" (UID: "d922deba-d455-45a7-ade3-dc2f588617bc") : configmap "swift-ring-files" not found Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.584014 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-scripts\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.584032 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-ring-data-devices\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.587062 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-dispersionconf\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.588898 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-dispersionconf\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.588905 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-combined-ca-bundle\") 
pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.589361 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-swiftconf\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.589728 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-swiftconf\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.600675 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zfm8\" (UniqueName: \"kubernetes.io/projected/c5574881-8546-456a-96b2-d58158e8a447-kube-api-access-5zfm8\") pod \"swift-ring-rebalance-zc6sk\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") " pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.601567 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvjj\" (UniqueName: \"kubernetes.io/projected/a45604ea-1cc6-41ef-a08a-5f535a8c79df-kube-api-access-rcvjj\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.617343 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-combined-ca-bundle\") pod \"swift-ring-rebalance-mzr6j\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " 
pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:16 crc kubenswrapper[4725]: I0225 11:11:16.652777 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zc6sk" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.082718 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7frxh" event={"ID":"9c296aab-4223-43bb-a032-45b20ffeaab5","Type":"ContainerStarted","Data":"d03c3df2831f435eec79aa4c11fd77f615f21dd1d257dba5c76fc719b708a1de"} Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.083040 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.083215 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zc6sk"] Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.084368 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.096029 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mzr6j" Feb 25 11:11:17 crc kubenswrapper[4725]: W0225 11:11:17.129513 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5574881_8546_456a_96b2_d58158e8a447.slice/crio-0a91002c464dd60b247c0190b87df27576dece593f38aec1787b9cb997cc09d1 WatchSource:0}: Error finding container 0a91002c464dd60b247c0190b87df27576dece593f38aec1787b9cb997cc09d1: Status 404 returned error can't find the container with id 0a91002c464dd60b247c0190b87df27576dece593f38aec1787b9cb997cc09d1 Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.134308 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-7frxh" podStartSLOduration=3.134286296 podStartE2EDuration="3.134286296s" podCreationTimestamp="2026-02-25 11:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:17.10643955 +0000 UTC m=+1102.605021585" watchObservedRunningTime="2026-02-25 11:11:17.134286296 +0000 UTC m=+1102.632868321" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.256669 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="163a1e93-3ba4-4a36-b01a-c3045f3d1311" path="/var/lib/kubelet/pods/163a1e93-3ba4-4a36-b01a-c3045f3d1311/volumes" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.293252 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-scripts\") pod \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.293322 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-combined-ca-bundle\") pod \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.293716 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-ring-data-devices\") pod \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.293744 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcvjj\" (UniqueName: \"kubernetes.io/projected/a45604ea-1cc6-41ef-a08a-5f535a8c79df-kube-api-access-rcvjj\") pod \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.293777 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-swiftconf\") pod \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.293803 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-dispersionconf\") pod \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.293856 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a45604ea-1cc6-41ef-a08a-5f535a8c79df-etc-swift\") pod \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\" (UID: \"a45604ea-1cc6-41ef-a08a-5f535a8c79df\") " Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 
11:11:17.293970 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-scripts" (OuterVolumeSpecName: "scripts") pod "a45604ea-1cc6-41ef-a08a-5f535a8c79df" (UID: "a45604ea-1cc6-41ef-a08a-5f535a8c79df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.294080 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.300290 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a45604ea-1cc6-41ef-a08a-5f535a8c79df" (UID: "a45604ea-1cc6-41ef-a08a-5f535a8c79df"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.302275 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a45604ea-1cc6-41ef-a08a-5f535a8c79df" (UID: "a45604ea-1cc6-41ef-a08a-5f535a8c79df"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.306117 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45604ea-1cc6-41ef-a08a-5f535a8c79df-kube-api-access-rcvjj" (OuterVolumeSpecName: "kube-api-access-rcvjj") pod "a45604ea-1cc6-41ef-a08a-5f535a8c79df" (UID: "a45604ea-1cc6-41ef-a08a-5f535a8c79df"). InnerVolumeSpecName "kube-api-access-rcvjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.308964 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45604ea-1cc6-41ef-a08a-5f535a8c79df-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a45604ea-1cc6-41ef-a08a-5f535a8c79df" (UID: "a45604ea-1cc6-41ef-a08a-5f535a8c79df"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.319053 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a45604ea-1cc6-41ef-a08a-5f535a8c79df" (UID: "a45604ea-1cc6-41ef-a08a-5f535a8c79df"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.320935 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a45604ea-1cc6-41ef-a08a-5f535a8c79df" (UID: "a45604ea-1cc6-41ef-a08a-5f535a8c79df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.402714 4725 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a45604ea-1cc6-41ef-a08a-5f535a8c79df-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.402751 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcvjj\" (UniqueName: \"kubernetes.io/projected/a45604ea-1cc6-41ef-a08a-5f535a8c79df-kube-api-access-rcvjj\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.402762 4725 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.402770 4725 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.402778 4725 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a45604ea-1cc6-41ef-a08a-5f535a8c79df-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.402787 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45604ea-1cc6-41ef-a08a-5f535a8c79df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.451142 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jfvkj"] Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.452047 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jfvkj" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.507429 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jfvkj"] Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.507514 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-operator-scripts\") pod \"glance-db-create-jfvkj\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") " pod="openstack/glance-db-create-jfvkj" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.507577 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7z65\" (UniqueName: \"kubernetes.io/projected/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-kube-api-access-m7z65\") pod \"glance-db-create-jfvkj\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") " pod="openstack/glance-db-create-jfvkj" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.567305 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4cc8-account-create-update-gxqmd" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.570583 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e644-account-create-update-g2stm"] Feb 25 11:11:17 crc kubenswrapper[4725]: E0225 11:11:17.570940 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8094412c-eb55-4366-a7a2-0bd29cff2983" containerName="mariadb-account-create-update" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.570956 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8094412c-eb55-4366-a7a2-0bd29cff2983" containerName="mariadb-account-create-update" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.571123 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8094412c-eb55-4366-a7a2-0bd29cff2983" containerName="mariadb-account-create-update" Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.571594 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.576389 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.580818 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e644-account-create-update-g2stm"]
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.628016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8094412c-eb55-4366-a7a2-0bd29cff2983-operator-scripts\") pod \"8094412c-eb55-4366-a7a2-0bd29cff2983\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.628245 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqp6p\" (UniqueName: \"kubernetes.io/projected/8094412c-eb55-4366-a7a2-0bd29cff2983-kube-api-access-gqp6p\") pod \"8094412c-eb55-4366-a7a2-0bd29cff2983\" (UID: \"8094412c-eb55-4366-a7a2-0bd29cff2983\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.628445 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-operator-scripts\") pod \"glance-db-create-jfvkj\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") " pod="openstack/glance-db-create-jfvkj"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.628489 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpwk9\" (UniqueName: \"kubernetes.io/projected/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-kube-api-access-xpwk9\") pod \"glance-e644-account-create-update-g2stm\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") " pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.628515 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-operator-scripts\") pod \"glance-e644-account-create-update-g2stm\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") " pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.628564 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7z65\" (UniqueName: \"kubernetes.io/projected/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-kube-api-access-m7z65\") pod \"glance-db-create-jfvkj\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") " pod="openstack/glance-db-create-jfvkj"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.628595 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0"
Feb 25 11:11:17 crc kubenswrapper[4725]: E0225 11:11:17.628748 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 25 11:11:17 crc kubenswrapper[4725]: E0225 11:11:17.628766 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 25 11:11:17 crc kubenswrapper[4725]: E0225 11:11:17.628806 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift podName:d922deba-d455-45a7-ade3-dc2f588617bc nodeName:}" failed. No retries permitted until 2026-02-25 11:11:19.628792383 +0000 UTC m=+1105.127374408 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift") pod "swift-storage-0" (UID: "d922deba-d455-45a7-ade3-dc2f588617bc") : configmap "swift-ring-files" not found
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.629631 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8094412c-eb55-4366-a7a2-0bd29cff2983-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8094412c-eb55-4366-a7a2-0bd29cff2983" (UID: "8094412c-eb55-4366-a7a2-0bd29cff2983"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.629865 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-operator-scripts\") pod \"glance-db-create-jfvkj\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") " pod="openstack/glance-db-create-jfvkj"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.633936 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8094412c-eb55-4366-a7a2-0bd29cff2983-kube-api-access-gqp6p" (OuterVolumeSpecName: "kube-api-access-gqp6p") pod "8094412c-eb55-4366-a7a2-0bd29cff2983" (UID: "8094412c-eb55-4366-a7a2-0bd29cff2983"). InnerVolumeSpecName "kube-api-access-gqp6p".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.648259 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7z65\" (UniqueName: \"kubernetes.io/projected/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-kube-api-access-m7z65\") pod \"glance-db-create-jfvkj\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") " pod="openstack/glance-db-create-jfvkj"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.683749 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kzkj5"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.688512 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bb4jr"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.698092 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdca-account-create-update-998xh"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729377 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-operator-scripts\") pod \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729443 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5586\" (UniqueName: \"kubernetes.io/projected/422d7ab0-0190-46dc-976e-e827bb7b48e8-kube-api-access-x5586\") pod \"422d7ab0-0190-46dc-976e-e827bb7b48e8\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729490 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2ef8b2-0d39-411c-b91a-a396aa246f66-operator-scripts\") pod \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\" (UID: \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729562 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ppzw\" (UniqueName: \"kubernetes.io/projected/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-kube-api-access-7ppzw\") pod \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\" (UID: \"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729603 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbltm\" (UniqueName: \"kubernetes.io/projected/ad2ef8b2-0d39-411c-b91a-a396aa246f66-kube-api-access-xbltm\") pod \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\" (UID: \"ad2ef8b2-0d39-411c-b91a-a396aa246f66\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729634 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/422d7ab0-0190-46dc-976e-e827bb7b48e8-operator-scripts\") pod \"422d7ab0-0190-46dc-976e-e827bb7b48e8\" (UID: \"422d7ab0-0190-46dc-976e-e827bb7b48e8\") "
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729958 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpwk9\" (UniqueName: \"kubernetes.io/projected/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-kube-api-access-xpwk9\") pod \"glance-e644-account-create-update-g2stm\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") " pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.729994 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-operator-scripts\") pod \"glance-e644-account-create-update-g2stm\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") " pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.730074 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8094412c-eb55-4366-a7a2-0bd29cff2983-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.730086 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqp6p\" (UniqueName: \"kubernetes.io/projected/8094412c-eb55-4366-a7a2-0bd29cff2983-kube-api-access-gqp6p\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.730287 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2ef8b2-0d39-411c-b91a-a396aa246f66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad2ef8b2-0d39-411c-b91a-a396aa246f66" (UID: "ad2ef8b2-0d39-411c-b91a-a396aa246f66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.730665 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-operator-scripts\") pod \"glance-e644-account-create-update-g2stm\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") " pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.730745 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09" (UID: "bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.731532 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/422d7ab0-0190-46dc-976e-e827bb7b48e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "422d7ab0-0190-46dc-976e-e827bb7b48e8" (UID: "422d7ab0-0190-46dc-976e-e827bb7b48e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.734371 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422d7ab0-0190-46dc-976e-e827bb7b48e8-kube-api-access-x5586" (OuterVolumeSpecName: "kube-api-access-x5586") pod "422d7ab0-0190-46dc-976e-e827bb7b48e8" (UID: "422d7ab0-0190-46dc-976e-e827bb7b48e8"). InnerVolumeSpecName "kube-api-access-x5586". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.736364 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-kube-api-access-7ppzw" (OuterVolumeSpecName: "kube-api-access-7ppzw") pod "bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09" (UID: "bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09"). InnerVolumeSpecName "kube-api-access-7ppzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.741290 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2ef8b2-0d39-411c-b91a-a396aa246f66-kube-api-access-xbltm" (OuterVolumeSpecName: "kube-api-access-xbltm") pod "ad2ef8b2-0d39-411c-b91a-a396aa246f66" (UID: "ad2ef8b2-0d39-411c-b91a-a396aa246f66"). InnerVolumeSpecName "kube-api-access-xbltm".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.748282 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpwk9\" (UniqueName: \"kubernetes.io/projected/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-kube-api-access-xpwk9\") pod \"glance-e644-account-create-update-g2stm\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") " pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.830918 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.830953 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5586\" (UniqueName: \"kubernetes.io/projected/422d7ab0-0190-46dc-976e-e827bb7b48e8-kube-api-access-x5586\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.830968 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2ef8b2-0d39-411c-b91a-a396aa246f66-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.830980 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ppzw\" (UniqueName: \"kubernetes.io/projected/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09-kube-api-access-7ppzw\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.830991 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbltm\" (UniqueName: \"kubernetes.io/projected/ad2ef8b2-0d39-411c-b91a-a396aa246f66-kube-api-access-xbltm\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.831006 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/422d7ab0-0190-46dc-976e-e827bb7b48e8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.867306 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jfvkj"
Feb 25 11:11:17 crc kubenswrapper[4725]: I0225 11:11:17.889388 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.103319 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zc6sk" event={"ID":"c5574881-8546-456a-96b2-d58158e8a447","Type":"ContainerStarted","Data":"0a91002c464dd60b247c0190b87df27576dece593f38aec1787b9cb997cc09d1"}
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.105449 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bb4jr" event={"ID":"ad2ef8b2-0d39-411c-b91a-a396aa246f66","Type":"ContainerDied","Data":"e9f278e086e5bd35614a484e84331dbae5e95b24a6fa33cf1854e2b3ede4466b"}
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.105492 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9f278e086e5bd35614a484e84331dbae5e95b24a6fa33cf1854e2b3ede4466b"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.105556 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bb4jr"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.117212 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kzkj5" event={"ID":"bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09","Type":"ContainerDied","Data":"7afcf9bf09a30ca262d30af044b31fe6381665ef566d1ec185af3104500cead9"}
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.117250 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7afcf9bf09a30ca262d30af044b31fe6381665ef566d1ec185af3104500cead9"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.117306 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kzkj5"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.136611 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4cc8-account-create-update-gxqmd" event={"ID":"8094412c-eb55-4366-a7a2-0bd29cff2983","Type":"ContainerDied","Data":"2bca8d5148d75e3894cb9d00bba51ecd5751038adbf410cba0c2213853fe63f8"}
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.136661 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bca8d5148d75e3894cb9d00bba51ecd5751038adbf410cba0c2213853fe63f8"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.136740 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4cc8-account-create-update-gxqmd"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.140440 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdca-account-create-update-998xh"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.141313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdca-account-create-update-998xh" event={"ID":"422d7ab0-0190-46dc-976e-e827bb7b48e8","Type":"ContainerDied","Data":"9f2c6fa465b059ce9f1437d2b9b3b40585cb7823f8fe19510f4b9c82bbc4e58e"}
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.141347 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f2c6fa465b059ce9f1437d2b9b3b40585cb7823f8fe19510f4b9c82bbc4e58e"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.141414 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mzr6j"
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.264648 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-mzr6j"]
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.273247 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-mzr6j"]
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.378618 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jfvkj"]
Feb 25 11:11:18 crc kubenswrapper[4725]: W0225 11:11:18.397020 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c835a1_6f18_44d6_a4ce_669692e0e6d9.slice/crio-8ea46b17d82ca126cebe624dbdc1a263bc6e366ed8e2291059e648d867189931 WatchSource:0}: Error finding container 8ea46b17d82ca126cebe624dbdc1a263bc6e366ed8e2291059e648d867189931: Status 404 returned error can't find the container with id 8ea46b17d82ca126cebe624dbdc1a263bc6e366ed8e2291059e648d867189931
Feb 25 11:11:18 crc kubenswrapper[4725]: I0225 11:11:18.507209 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack/glance-e644-account-create-update-g2stm"]
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.149493 4725 generic.go:334] "Generic (PLEG): container finished" podID="43468cb6-ecc1-44f0-b5f0-7de8f76cc465" containerID="eebb3933bcc50618730ab1a0fa945eb463582eb4b6ca24b3dd338def8b68d79b" exitCode=0
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.149799 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e644-account-create-update-g2stm" event={"ID":"43468cb6-ecc1-44f0-b5f0-7de8f76cc465","Type":"ContainerDied","Data":"eebb3933bcc50618730ab1a0fa945eb463582eb4b6ca24b3dd338def8b68d79b"}
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.149869 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e644-account-create-update-g2stm" event={"ID":"43468cb6-ecc1-44f0-b5f0-7de8f76cc465","Type":"ContainerStarted","Data":"f5b28f88ce8647243de958c01493ac969bca50a9bae29d88fdb2477e7dc79b5d"}
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.155348 4725 generic.go:334] "Generic (PLEG): container finished" podID="e7c835a1-6f18-44d6-a4ce-669692e0e6d9" containerID="4e542341c3e93604d629b01baf6f5217bb7546587b44eabe4c3a16ae31a4e166" exitCode=0
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.155482 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jfvkj" event={"ID":"e7c835a1-6f18-44d6-a4ce-669692e0e6d9","Type":"ContainerDied","Data":"4e542341c3e93604d629b01baf6f5217bb7546587b44eabe4c3a16ae31a4e166"}
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.155574 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jfvkj" event={"ID":"e7c835a1-6f18-44d6-a4ce-669692e0e6d9","Type":"ContainerStarted","Data":"8ea46b17d82ca126cebe624dbdc1a263bc6e366ed8e2291059e648d867189931"}
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.245132 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45604ea-1cc6-41ef-a08a-5f535a8c79df" path="/var/lib/kubelet/pods/a45604ea-1cc6-41ef-a08a-5f535a8c79df/volumes"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.285999 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5bfb4"]
Feb 25 11:11:19 crc kubenswrapper[4725]: E0225 11:11:19.287023 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2ef8b2-0d39-411c-b91a-a396aa246f66" containerName="mariadb-database-create"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.289150 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2ef8b2-0d39-411c-b91a-a396aa246f66" containerName="mariadb-database-create"
Feb 25 11:11:19 crc kubenswrapper[4725]: E0225 11:11:19.289265 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422d7ab0-0190-46dc-976e-e827bb7b48e8" containerName="mariadb-account-create-update"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.289353 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="422d7ab0-0190-46dc-976e-e827bb7b48e8" containerName="mariadb-account-create-update"
Feb 25 11:11:19 crc kubenswrapper[4725]: E0225 11:11:19.289450 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09" containerName="mariadb-database-create"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.289521 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09" containerName="mariadb-database-create"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.289895 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09" containerName="mariadb-database-create"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.289974 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="422d7ab0-0190-46dc-976e-e827bb7b48e8" containerName="mariadb-account-create-update"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.290047 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2ef8b2-0d39-411c-b91a-a396aa246f66" containerName="mariadb-database-create"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.290768 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.299016 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.300339 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5bfb4"]
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.361274 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8zpx\" (UniqueName: \"kubernetes.io/projected/3e506a50-7f55-435f-9a7e-55e75a1edca8-kube-api-access-v8zpx\") pod \"root-account-create-update-5bfb4\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.361329 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e506a50-7f55-435f-9a7e-55e75a1edca8-operator-scripts\") pod \"root-account-create-update-5bfb4\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.463729 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8zpx\" (UniqueName: \"kubernetes.io/projected/3e506a50-7f55-435f-9a7e-55e75a1edca8-kube-api-access-v8zpx\") pod \"root-account-create-update-5bfb4\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.464751 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e506a50-7f55-435f-9a7e-55e75a1edca8-operator-scripts\") pod \"root-account-create-update-5bfb4\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.465719 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e506a50-7f55-435f-9a7e-55e75a1edca8-operator-scripts\") pod \"root-account-create-update-5bfb4\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.488429 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8zpx\" (UniqueName: \"kubernetes.io/projected/3e506a50-7f55-435f-9a7e-55e75a1edca8-kube-api-access-v8zpx\") pod \"root-account-create-update-5bfb4\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.622788 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/root-account-create-update-5bfb4"
Feb 25 11:11:19 crc kubenswrapper[4725]: I0225 11:11:19.668383 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0"
Feb 25 11:11:19 crc kubenswrapper[4725]: E0225 11:11:19.668637 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 25 11:11:19 crc kubenswrapper[4725]: E0225 11:11:19.668657 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 25 11:11:19 crc kubenswrapper[4725]: E0225 11:11:19.668700 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift podName:d922deba-d455-45a7-ade3-dc2f588617bc nodeName:}" failed. No retries permitted until 2026-02-25 11:11:23.668685056 +0000 UTC m=+1109.167267081 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift") pod "swift-storage-0" (UID: "d922deba-d455-45a7-ade3-dc2f588617bc") : configmap "swift-ring-files" not found
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.513318 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.522408 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jfvkj"
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.639597 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-operator-scripts\") pod \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") "
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.639646 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-operator-scripts\") pod \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") "
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.639692 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpwk9\" (UniqueName: \"kubernetes.io/projected/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-kube-api-access-xpwk9\") pod \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\" (UID: \"43468cb6-ecc1-44f0-b5f0-7de8f76cc465\") "
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.639727 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7z65\" (UniqueName: \"kubernetes.io/projected/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-kube-api-access-m7z65\") pod \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\" (UID: \"e7c835a1-6f18-44d6-a4ce-669692e0e6d9\") "
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.640491 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7c835a1-6f18-44d6-a4ce-669692e0e6d9" (UID: "e7c835a1-6f18-44d6-a4ce-669692e0e6d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.640539 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43468cb6-ecc1-44f0-b5f0-7de8f76cc465" (UID: "43468cb6-ecc1-44f0-b5f0-7de8f76cc465"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.645707 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-kube-api-access-xpwk9" (OuterVolumeSpecName: "kube-api-access-xpwk9") pod "43468cb6-ecc1-44f0-b5f0-7de8f76cc465" (UID: "43468cb6-ecc1-44f0-b5f0-7de8f76cc465"). InnerVolumeSpecName "kube-api-access-xpwk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.648907 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-kube-api-access-m7z65" (OuterVolumeSpecName: "kube-api-access-m7z65") pod "e7c835a1-6f18-44d6-a4ce-669692e0e6d9" (UID: "e7c835a1-6f18-44d6-a4ce-669692e0e6d9"). InnerVolumeSpecName "kube-api-access-m7z65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.741957 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.741990 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.741999 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpwk9\" (UniqueName: \"kubernetes.io/projected/43468cb6-ecc1-44f0-b5f0-7de8f76cc465-kube-api-access-xpwk9\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.742017 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7z65\" (UniqueName: \"kubernetes.io/projected/e7c835a1-6f18-44d6-a4ce-669692e0e6d9-kube-api-access-m7z65\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:21 crc kubenswrapper[4725]: I0225 11:11:21.858942 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5bfb4"]
Feb 25 11:11:22 crc kubenswrapper[4725]: I0225 11:11:22.183570 4725 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-e644-account-create-update-g2stm"
Feb 25 11:11:22 crc kubenswrapper[4725]: I0225 11:11:22.183555 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e644-account-create-update-g2stm" event={"ID":"43468cb6-ecc1-44f0-b5f0-7de8f76cc465","Type":"ContainerDied","Data":"f5b28f88ce8647243de958c01493ac969bca50a9bae29d88fdb2477e7dc79b5d"}
Feb 25 11:11:22 crc kubenswrapper[4725]: I0225 11:11:22.184126 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b28f88ce8647243de958c01493ac969bca50a9bae29d88fdb2477e7dc79b5d"
Feb 25 11:11:22 crc kubenswrapper[4725]: I0225 11:11:22.185471 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jfvkj"
Feb 25 11:11:22 crc kubenswrapper[4725]: I0225 11:11:22.185460 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jfvkj" event={"ID":"e7c835a1-6f18-44d6-a4ce-669692e0e6d9","Type":"ContainerDied","Data":"8ea46b17d82ca126cebe624dbdc1a263bc6e366ed8e2291059e648d867189931"}
Feb 25 11:11:22 crc kubenswrapper[4725]: I0225 11:11:22.185629 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ea46b17d82ca126cebe624dbdc1a263bc6e366ed8e2291059e648d867189931"
Feb 25 11:11:22 crc kubenswrapper[4725]: I0225 11:11:22.186600 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5bfb4" event={"ID":"3e506a50-7f55-435f-9a7e-55e75a1edca8","Type":"ContainerStarted","Data":"b550e3cb9dfb6f2f2e65158f41eab70b5db8563f374bbd23f92fb56488afa0c5"}
Feb 25 11:11:23 crc kubenswrapper[4725]: I0225 11:11:23.200524 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5bfb4" event={"ID":"3e506a50-7f55-435f-9a7e-55e75a1edca8","Type":"ContainerStarted","Data":"8c2a5a7f2c12e174eb0d21615a7ac8d84e21b646284bb0b0a912c6ca48a88f8f"}
Feb 25 11:11:23 crc kubenswrapper[4725]: I0225 11:11:23.684203 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0"
Feb 25 11:11:23 crc kubenswrapper[4725]: E0225 11:11:23.684433 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 25 11:11:23 crc kubenswrapper[4725]: E0225 11:11:23.684572 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 25 11:11:23 crc kubenswrapper[4725]: E0225 11:11:23.684622 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift podName:d922deba-d455-45a7-ade3-dc2f588617bc nodeName:}" failed. No retries permitted until 2026-02-25 11:11:31.684606003 +0000 UTC m=+1117.183188028 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift") pod "swift-storage-0" (UID: "d922deba-d455-45a7-ade3-dc2f588617bc") : configmap "swift-ring-files" not found
Feb 25 11:11:24 crc kubenswrapper[4725]: I0225 11:11:24.240420 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zc6sk" event={"ID":"c5574881-8546-456a-96b2-d58158e8a447","Type":"ContainerStarted","Data":"511144eda0aa00ee182a064844dc975be5308246edd8d270ab0c5fef1a8197d3"}
Feb 25 11:11:24 crc kubenswrapper[4725]: I0225 11:11:24.244890 4725 generic.go:334] "Generic (PLEG): container finished" podID="3e506a50-7f55-435f-9a7e-55e75a1edca8" containerID="8c2a5a7f2c12e174eb0d21615a7ac8d84e21b646284bb0b0a912c6ca48a88f8f" exitCode=0
Feb 25 11:11:24 crc kubenswrapper[4725]: I0225 11:11:24.244995 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5bfb4" event={"ID":"3e506a50-7f55-435f-9a7e-55e75a1edca8","Type":"ContainerDied","Data":"8c2a5a7f2c12e174eb0d21615a7ac8d84e21b646284bb0b0a912c6ca48a88f8f"}
Feb 25 11:11:24 crc kubenswrapper[4725]: I0225 11:11:24.264264 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-5bfb4" podStartSLOduration=5.264246641 podStartE2EDuration="5.264246641s" podCreationTimestamp="2026-02-25 11:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:23.225041111 +0000 UTC m=+1108.723623176" watchObservedRunningTime="2026-02-25 11:11:24.264246641 +0000 UTC m=+1109.762828666"
Feb 25 11:11:24 crc kubenswrapper[4725]: I0225 11:11:24.264585 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-zc6sk" podStartSLOduration=2.257034456 podStartE2EDuration="8.26458131s" podCreationTimestamp="2026-02-25 11:11:16 +0000 UTC" firstStartedPulling="2026-02-25 11:11:17.14084572 +0000 UTC m=+1102.639427745" lastFinishedPulling="2026-02-25 11:11:23.148392534 +0000 UTC m=+1108.646974599" observedRunningTime="2026-02-25 11:11:24.259873585 +0000 UTC m=+1109.758455650" watchObservedRunningTime="2026-02-25 11:11:24.26458131 +0000 UTC m=+1109.763163335"
Feb 25 11:11:24 crc kubenswrapper[4725]: I0225 11:11:24.944086 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-7frxh"
Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.081312 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xrrkb"]
Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.081527 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" podUID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerName="dnsmasq-dns" containerID="cri-o://46dfedb0b8c045a8303726a41537738d3fa9e1d483dd0b7273834663cc71e3d5" gracePeriod=10
Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.253460 4725 generic.go:334] "Generic (PLEG): container finished" podID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerID="46dfedb0b8c045a8303726a41537738d3fa9e1d483dd0b7273834663cc71e3d5" exitCode=0
Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.253614 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" event={"ID":"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3","Type":"ContainerDied","Data":"46dfedb0b8c045a8303726a41537738d3fa9e1d483dd0b7273834663cc71e3d5"}
Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.654264 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb"
Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.658692 4725 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/root-account-create-update-5bfb4" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.752758 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxq6\" (UniqueName: \"kubernetes.io/projected/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-kube-api-access-mtxq6\") pod \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.752816 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-dns-svc\") pod \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.752939 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-config\") pod \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\" (UID: \"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3\") " Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.753000 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8zpx\" (UniqueName: \"kubernetes.io/projected/3e506a50-7f55-435f-9a7e-55e75a1edca8-kube-api-access-v8zpx\") pod \"3e506a50-7f55-435f-9a7e-55e75a1edca8\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.753032 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e506a50-7f55-435f-9a7e-55e75a1edca8-operator-scripts\") pod \"3e506a50-7f55-435f-9a7e-55e75a1edca8\" (UID: \"3e506a50-7f55-435f-9a7e-55e75a1edca8\") " Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.753654 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3e506a50-7f55-435f-9a7e-55e75a1edca8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e506a50-7f55-435f-9a7e-55e75a1edca8" (UID: "3e506a50-7f55-435f-9a7e-55e75a1edca8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.786355 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e506a50-7f55-435f-9a7e-55e75a1edca8-kube-api-access-v8zpx" (OuterVolumeSpecName: "kube-api-access-v8zpx") pod "3e506a50-7f55-435f-9a7e-55e75a1edca8" (UID: "3e506a50-7f55-435f-9a7e-55e75a1edca8"). InnerVolumeSpecName "kube-api-access-v8zpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.816352 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-kube-api-access-mtxq6" (OuterVolumeSpecName: "kube-api-access-mtxq6") pod "4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" (UID: "4efcc1fc-3f0d-42c6-81bc-b9b5797279a3"). InnerVolumeSpecName "kube-api-access-mtxq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.854287 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8zpx\" (UniqueName: \"kubernetes.io/projected/3e506a50-7f55-435f-9a7e-55e75a1edca8-kube-api-access-v8zpx\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.854313 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e506a50-7f55-435f-9a7e-55e75a1edca8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.854324 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtxq6\" (UniqueName: \"kubernetes.io/projected/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-kube-api-access-mtxq6\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.854492 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-config" (OuterVolumeSpecName: "config") pod "4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" (UID: "4efcc1fc-3f0d-42c6-81bc-b9b5797279a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.863876 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" (UID: "4efcc1fc-3f0d-42c6-81bc-b9b5797279a3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.956076 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:25 crc kubenswrapper[4725]: I0225 11:11:25.956288 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.261065 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5bfb4" event={"ID":"3e506a50-7f55-435f-9a7e-55e75a1edca8","Type":"ContainerDied","Data":"b550e3cb9dfb6f2f2e65158f41eab70b5db8563f374bbd23f92fb56488afa0c5"} Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.261102 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b550e3cb9dfb6f2f2e65158f41eab70b5db8563f374bbd23f92fb56488afa0c5" Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.261150 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5bfb4" Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.264188 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" event={"ID":"4efcc1fc-3f0d-42c6-81bc-b9b5797279a3","Type":"ContainerDied","Data":"58621bd4fd3545459bb904f533102681fae8519b1d5f06f44114632c0f627f26"} Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.264220 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xrrkb" Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.264240 4725 scope.go:117] "RemoveContainer" containerID="46dfedb0b8c045a8303726a41537738d3fa9e1d483dd0b7273834663cc71e3d5" Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.283955 4725 scope.go:117] "RemoveContainer" containerID="116fbcc0df11fbf02a24e3e8fc933cd7c66f193e71c982b54d937e64f57d4849" Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.294773 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xrrkb"] Feb 25 11:11:26 crc kubenswrapper[4725]: I0225 11:11:26.301005 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xrrkb"] Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.239288 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" path="/var/lib/kubelet/pods/4efcc1fc-3f0d-42c6-81bc-b9b5797279a3/volumes" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.697384 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jtvbl"] Feb 25 11:11:27 crc kubenswrapper[4725]: E0225 11:11:27.698080 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e506a50-7f55-435f-9a7e-55e75a1edca8" containerName="mariadb-account-create-update" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698096 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e506a50-7f55-435f-9a7e-55e75a1edca8" containerName="mariadb-account-create-update" Feb 25 11:11:27 crc kubenswrapper[4725]: E0225 11:11:27.698112 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerName="dnsmasq-dns" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698122 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerName="dnsmasq-dns" Feb 25 11:11:27 crc 
kubenswrapper[4725]: E0225 11:11:27.698140 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43468cb6-ecc1-44f0-b5f0-7de8f76cc465" containerName="mariadb-account-create-update" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698148 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="43468cb6-ecc1-44f0-b5f0-7de8f76cc465" containerName="mariadb-account-create-update" Feb 25 11:11:27 crc kubenswrapper[4725]: E0225 11:11:27.698165 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerName="init" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698172 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerName="init" Feb 25 11:11:27 crc kubenswrapper[4725]: E0225 11:11:27.698183 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c835a1-6f18-44d6-a4ce-669692e0e6d9" containerName="mariadb-database-create" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698191 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c835a1-6f18-44d6-a4ce-669692e0e6d9" containerName="mariadb-database-create" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698387 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="43468cb6-ecc1-44f0-b5f0-7de8f76cc465" containerName="mariadb-account-create-update" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698402 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e506a50-7f55-435f-9a7e-55e75a1edca8" containerName="mariadb-account-create-update" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698422 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efcc1fc-3f0d-42c6-81bc-b9b5797279a3" containerName="dnsmasq-dns" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.698435 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c835a1-6f18-44d6-a4ce-669692e0e6d9" 
containerName="mariadb-database-create" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.699001 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.702047 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.703033 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sz5r7" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.750124 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jtvbl"] Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.803352 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-config-data\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.803404 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-db-sync-config-data\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.803427 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-combined-ca-bundle\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.803498 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kd5w\" (UniqueName: \"kubernetes.io/projected/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-kube-api-access-5kd5w\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.905424 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kd5w\" (UniqueName: \"kubernetes.io/projected/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-kube-api-access-5kd5w\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.905576 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-config-data\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.905636 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-db-sync-config-data\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.905715 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-combined-ca-bundle\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.930937 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-db-sync-config-data\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.931015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-config-data\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.934165 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-combined-ca-bundle\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:27 crc kubenswrapper[4725]: I0225 11:11:27.937140 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kd5w\" (UniqueName: \"kubernetes.io/projected/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-kube-api-access-5kd5w\") pod \"glance-db-sync-jtvbl\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:28 crc kubenswrapper[4725]: I0225 11:11:28.028707 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:28 crc kubenswrapper[4725]: I0225 11:11:28.566649 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jtvbl"] Feb 25 11:11:28 crc kubenswrapper[4725]: W0225 11:11:28.576053 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88fc0f62_6868_40e9_a04e_5de23ca3e5fe.slice/crio-32f91a7f72a8591c7caeaa56b397706991b4f19687be28e10b50a4b136521e91 WatchSource:0}: Error finding container 32f91a7f72a8591c7caeaa56b397706991b4f19687be28e10b50a4b136521e91: Status 404 returned error can't find the container with id 32f91a7f72a8591c7caeaa56b397706991b4f19687be28e10b50a4b136521e91 Feb 25 11:11:28 crc kubenswrapper[4725]: I0225 11:11:28.742521 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 25 11:11:29 crc kubenswrapper[4725]: I0225 11:11:29.301683 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jtvbl" event={"ID":"88fc0f62-6868-40e9-a04e-5de23ca3e5fe","Type":"ContainerStarted","Data":"32f91a7f72a8591c7caeaa56b397706991b4f19687be28e10b50a4b136521e91"} Feb 25 11:11:30 crc kubenswrapper[4725]: I0225 11:11:30.581031 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5bfb4"] Feb 25 11:11:30 crc kubenswrapper[4725]: I0225 11:11:30.587501 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5bfb4"] Feb 25 11:11:31 crc kubenswrapper[4725]: I0225 11:11:31.233050 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e506a50-7f55-435f-9a7e-55e75a1edca8" path="/var/lib/kubelet/pods/3e506a50-7f55-435f-9a7e-55e75a1edca8/volumes" Feb 25 11:11:31 crc kubenswrapper[4725]: I0225 11:11:31.777221 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:31 crc kubenswrapper[4725]: I0225 11:11:31.799635 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d922deba-d455-45a7-ade3-dc2f588617bc-etc-swift\") pod \"swift-storage-0\" (UID: \"d922deba-d455-45a7-ade3-dc2f588617bc\") " pod="openstack/swift-storage-0" Feb 25 11:11:32 crc kubenswrapper[4725]: I0225 11:11:32.051080 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 25 11:11:32 crc kubenswrapper[4725]: I0225 11:11:32.331585 4725 generic.go:334] "Generic (PLEG): container finished" podID="c5574881-8546-456a-96b2-d58158e8a447" containerID="511144eda0aa00ee182a064844dc975be5308246edd8d270ab0c5fef1a8197d3" exitCode=0 Feb 25 11:11:32 crc kubenswrapper[4725]: I0225 11:11:32.331630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zc6sk" event={"ID":"c5574881-8546-456a-96b2-d58158e8a447","Type":"ContainerDied","Data":"511144eda0aa00ee182a064844dc975be5308246edd8d270ab0c5fef1a8197d3"} Feb 25 11:11:32 crc kubenswrapper[4725]: I0225 11:11:32.598943 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 25 11:11:33 crc kubenswrapper[4725]: I0225 11:11:33.220454 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xpvnr" podUID="d2445fb4-75ca-4ea2-b979-5757105279ab" containerName="ovn-controller" probeResult="failure" output=< Feb 25 11:11:33 crc kubenswrapper[4725]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 25 11:11:33 crc kubenswrapper[4725]: > Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.360651 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerID="65bb35575781bad2e98c04d4e1b97efb65e9db76bd69365abd39ee6385396cf2" exitCode=0 Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.361033 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57e7a103-f119-4d8e-bb7f-96f36b66994e","Type":"ContainerDied","Data":"65bb35575781bad2e98c04d4e1b97efb65e9db76bd69365abd39ee6385396cf2"} Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.363227 4725 generic.go:334] "Generic (PLEG): container finished" podID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerID="6d69b6d7376a54b89e12188a0e9f6681be6c795c0ea23114c746f56b5175501a" exitCode=0 Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.363270 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d1a511fd-4696-456a-8263-da4cd2f5eff1","Type":"ContainerDied","Data":"6d69b6d7376a54b89e12188a0e9f6681be6c795c0ea23114c746f56b5175501a"} Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.614240 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zf9sp"] Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.616045 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.621211 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.646865 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zf9sp"] Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.674690 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6784bd0f-0863-4990-bc78-c04561fbd465-operator-scripts\") pod \"root-account-create-update-zf9sp\" (UID: \"6784bd0f-0863-4990-bc78-c04561fbd465\") " pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.675011 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgjmx\" (UniqueName: \"kubernetes.io/projected/6784bd0f-0863-4990-bc78-c04561fbd465-kube-api-access-cgjmx\") pod \"root-account-create-update-zf9sp\" (UID: \"6784bd0f-0863-4990-bc78-c04561fbd465\") " pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.775736 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgjmx\" (UniqueName: \"kubernetes.io/projected/6784bd0f-0863-4990-bc78-c04561fbd465-kube-api-access-cgjmx\") pod \"root-account-create-update-zf9sp\" (UID: \"6784bd0f-0863-4990-bc78-c04561fbd465\") " pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.775783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6784bd0f-0863-4990-bc78-c04561fbd465-operator-scripts\") pod \"root-account-create-update-zf9sp\" (UID: 
\"6784bd0f-0863-4990-bc78-c04561fbd465\") " pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.776504 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6784bd0f-0863-4990-bc78-c04561fbd465-operator-scripts\") pod \"root-account-create-update-zf9sp\" (UID: \"6784bd0f-0863-4990-bc78-c04561fbd465\") " pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.804931 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgjmx\" (UniqueName: \"kubernetes.io/projected/6784bd0f-0863-4990-bc78-c04561fbd465-kube-api-access-cgjmx\") pod \"root-account-create-update-zf9sp\" (UID: \"6784bd0f-0863-4990-bc78-c04561fbd465\") " pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:35 crc kubenswrapper[4725]: I0225 11:11:35.943638 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zf9sp" Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.221971 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xpvnr" podUID="d2445fb4-75ca-4ea2-b979-5757105279ab" containerName="ovn-controller" probeResult="failure" output=< Feb 25 11:11:38 crc kubenswrapper[4725]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 25 11:11:38 crc kubenswrapper[4725]: > Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.262907 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.294906 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-drphb" Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.518996 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xpvnr-config-qzwx5"] Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.520189 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.523366 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.527473 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xpvnr-config-qzwx5"]
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.628374 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-log-ovn\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.628781 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtdlv\" (UniqueName: \"kubernetes.io/projected/e05bc7f5-efe6-462b-91e6-07f5113858a9-kube-api-access-xtdlv\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.628927 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.629045 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run-ovn\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.629130 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-scripts\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.629216 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-additional-scripts\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.731265 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.731333 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run-ovn\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.731352 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-scripts\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.731388 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-additional-scripts\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.731440 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-log-ovn\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.731489 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtdlv\" (UniqueName: \"kubernetes.io/projected/e05bc7f5-efe6-462b-91e6-07f5113858a9-kube-api-access-xtdlv\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.732005 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run-ovn\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.732038 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-log-ovn\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.732092 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.732738 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-additional-scripts\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.734684 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-scripts\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.772542 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtdlv\" (UniqueName: \"kubernetes.io/projected/e05bc7f5-efe6-462b-91e6-07f5113858a9-kube-api-access-xtdlv\") pod \"ovn-controller-xpvnr-config-qzwx5\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") " pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:38 crc kubenswrapper[4725]: I0225 11:11:38.866317 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.669901 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zc6sk"
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.755382 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-combined-ca-bundle\") pod \"c5574881-8546-456a-96b2-d58158e8a447\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") "
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.755418 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-ring-data-devices\") pod \"c5574881-8546-456a-96b2-d58158e8a447\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") "
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.755580 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-scripts\") pod \"c5574881-8546-456a-96b2-d58158e8a447\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") "
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.755618 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-dispersionconf\") pod \"c5574881-8546-456a-96b2-d58158e8a447\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") "
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.755648 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-swiftconf\") pod \"c5574881-8546-456a-96b2-d58158e8a447\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") "
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.755666 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5574881-8546-456a-96b2-d58158e8a447-etc-swift\") pod \"c5574881-8546-456a-96b2-d58158e8a447\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") "
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.755704 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zfm8\" (UniqueName: \"kubernetes.io/projected/c5574881-8546-456a-96b2-d58158e8a447-kube-api-access-5zfm8\") pod \"c5574881-8546-456a-96b2-d58158e8a447\" (UID: \"c5574881-8546-456a-96b2-d58158e8a447\") "
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.759092 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c5574881-8546-456a-96b2-d58158e8a447" (UID: "c5574881-8546-456a-96b2-d58158e8a447"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.760286 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5574881-8546-456a-96b2-d58158e8a447-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c5574881-8546-456a-96b2-d58158e8a447" (UID: "c5574881-8546-456a-96b2-d58158e8a447"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.763213 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5574881-8546-456a-96b2-d58158e8a447-kube-api-access-5zfm8" (OuterVolumeSpecName: "kube-api-access-5zfm8") pod "c5574881-8546-456a-96b2-d58158e8a447" (UID: "c5574881-8546-456a-96b2-d58158e8a447"). InnerVolumeSpecName "kube-api-access-5zfm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.784393 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c5574881-8546-456a-96b2-d58158e8a447" (UID: "c5574881-8546-456a-96b2-d58158e8a447"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.807813 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-scripts" (OuterVolumeSpecName: "scripts") pod "c5574881-8546-456a-96b2-d58158e8a447" (UID: "c5574881-8546-456a-96b2-d58158e8a447"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.829378 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c5574881-8546-456a-96b2-d58158e8a447" (UID: "c5574881-8546-456a-96b2-d58158e8a447"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.844204 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5574881-8546-456a-96b2-d58158e8a447" (UID: "c5574881-8546-456a-96b2-d58158e8a447"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.858507 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.858544 4725 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.858558 4725 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.858570 4725 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5574881-8546-456a-96b2-d58158e8a447-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.858583 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zfm8\" (UniqueName: \"kubernetes.io/projected/c5574881-8546-456a-96b2-d58158e8a447-kube-api-access-5zfm8\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.858596 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5574881-8546-456a-96b2-d58158e8a447-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:39 crc kubenswrapper[4725]: I0225 11:11:39.858607 4725 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5574881-8546-456a-96b2-d58158e8a447-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.125327 4725 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zf9sp"]
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.251862 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xpvnr-config-qzwx5"]
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.414083 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zc6sk" event={"ID":"c5574881-8546-456a-96b2-d58158e8a447","Type":"ContainerDied","Data":"0a91002c464dd60b247c0190b87df27576dece593f38aec1787b9cb997cc09d1"}
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.414247 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a91002c464dd60b247c0190b87df27576dece593f38aec1787b9cb997cc09d1"
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.414226 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zc6sk"
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.416111 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zf9sp" event={"ID":"6784bd0f-0863-4990-bc78-c04561fbd465","Type":"ContainerStarted","Data":"cec057bfffb3bb2d8ca364f7a56b71c477134ce0ec09829a5ce7ddee17499e07"}
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.416133 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zf9sp" event={"ID":"6784bd0f-0863-4990-bc78-c04561fbd465","Type":"ContainerStarted","Data":"ed65cf7db228124520de1bf5d6742c214d3a950a62bba7042e37f4b761ebc092"}
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.419159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"e9cf4888fd58c64ac3fd719c043667f18ae8aacedc68872b89f76fa301a9fefa"}
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.423233 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d1a511fd-4696-456a-8263-da4cd2f5eff1","Type":"ContainerStarted","Data":"ba1270fb11896d23a9d4c55ad713140436475dae01dfee53fe8721ec435833ea"}
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.423706 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.426038 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jtvbl" event={"ID":"88fc0f62-6868-40e9-a04e-5de23ca3e5fe","Type":"ContainerStarted","Data":"3c5dcef313b89405ca14bcd4691e17176c33f08f540c16811345ad7cd839f737"}
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.430845 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57e7a103-f119-4d8e-bb7f-96f36b66994e","Type":"ContainerStarted","Data":"68acb62c236cce60fe0e6b8ce02f29b116f03427200c51be4c7cdd38ee606404"}
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.431129 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.435671 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-zf9sp" podStartSLOduration=5.435652667 podStartE2EDuration="5.435652667s" podCreationTimestamp="2026-02-25 11:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:40.427680447 +0000 UTC m=+1125.926262472" watchObservedRunningTime="2026-02-25 11:11:40.435652667 +0000 UTC m=+1125.934234692"
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.448368 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.51287462 podStartE2EDuration="1m3.448347393s" podCreationTimestamp="2026-02-25 11:10:37 +0000 UTC" firstStartedPulling="2026-02-25 11:10:53.125787744 +0000 UTC m=+1078.624369769" lastFinishedPulling="2026-02-25 11:11:01.061260517 +0000 UTC m=+1086.559842542" observedRunningTime="2026-02-25 11:11:40.446540955 +0000 UTC m=+1125.945122980" watchObservedRunningTime="2026-02-25 11:11:40.448347393 +0000 UTC m=+1125.946929438"
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.472522 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=55.113700477 podStartE2EDuration="1m3.472508292s" podCreationTimestamp="2026-02-25 11:10:37 +0000 UTC" firstStartedPulling="2026-02-25 11:10:52.788857122 +0000 UTC m=+1078.287439157" lastFinishedPulling="2026-02-25 11:11:01.147664937 +0000 UTC m=+1086.646246972" observedRunningTime="2026-02-25 11:11:40.465870816 +0000 UTC m=+1125.964452841" watchObservedRunningTime="2026-02-25 11:11:40.472508292 +0000 UTC m=+1125.971090317"
Feb 25 11:11:40 crc kubenswrapper[4725]: I0225 11:11:40.492527 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jtvbl" podStartSLOduration=2.421839888 podStartE2EDuration="13.492511691s" podCreationTimestamp="2026-02-25 11:11:27 +0000 UTC" firstStartedPulling="2026-02-25 11:11:28.578133727 +0000 UTC m=+1114.076715762" lastFinishedPulling="2026-02-25 11:11:39.64880553 +0000 UTC m=+1125.147387565" observedRunningTime="2026-02-25 11:11:40.491875174 +0000 UTC m=+1125.990457209" watchObservedRunningTime="2026-02-25 11:11:40.492511691 +0000 UTC m=+1125.991093716"
Feb 25 11:11:40 crc kubenswrapper[4725]: W0225 11:11:40.514004 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode05bc7f5_efe6_462b_91e6_07f5113858a9.slice/crio-84bd527a339874f51458ec9df62b680f178db5c68c149d9e9b697f0cf81f799a WatchSource:0}: Error finding container 84bd527a339874f51458ec9df62b680f178db5c68c149d9e9b697f0cf81f799a: Status 404 returned error can't find the container with id 84bd527a339874f51458ec9df62b680f178db5c68c149d9e9b697f0cf81f799a
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.449940 4725 generic.go:334] "Generic (PLEG): container finished" podID="e05bc7f5-efe6-462b-91e6-07f5113858a9" containerID="cc458f72993980725388a3b3f0c97c2fe01765ceeda69dec8ffe26f437197b33" exitCode=0
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.450066 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xpvnr-config-qzwx5" event={"ID":"e05bc7f5-efe6-462b-91e6-07f5113858a9","Type":"ContainerDied","Data":"cc458f72993980725388a3b3f0c97c2fe01765ceeda69dec8ffe26f437197b33"}
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.450476 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xpvnr-config-qzwx5" event={"ID":"e05bc7f5-efe6-462b-91e6-07f5113858a9","Type":"ContainerStarted","Data":"84bd527a339874f51458ec9df62b680f178db5c68c149d9e9b697f0cf81f799a"}
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.452799 4725 generic.go:334] "Generic (PLEG): container finished" podID="6784bd0f-0863-4990-bc78-c04561fbd465" containerID="cec057bfffb3bb2d8ca364f7a56b71c477134ce0ec09829a5ce7ddee17499e07" exitCode=0
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.452875 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zf9sp" event={"ID":"6784bd0f-0863-4990-bc78-c04561fbd465","Type":"ContainerDied","Data":"cec057bfffb3bb2d8ca364f7a56b71c477134ce0ec09829a5ce7ddee17499e07"}
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.458408 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"496dad572519ab10526e660bcbfde473b6c582366739a5035e414218a4afbef5"}
Feb 25 11:11:41 crc kubenswrapper[4725]: 
I0225 11:11:41.458459 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"5001e4621f258447b538274c92455a115d86cb94732c727c8d892aec21e94b4e"}
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.458472 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"ef2d107df0158a83d258bf6b6a4e5d1ca480096d4c417a59b18563f7c8d84e53"}
Feb 25 11:11:41 crc kubenswrapper[4725]: I0225 11:11:41.458482 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"6bdd63ae2c099ff3d5774b010adf6019ea6258fa4284a8243de53b7e0e89b99c"}
Feb 25 11:11:42 crc kubenswrapper[4725]: I0225 11:11:42.478315 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"c19c1bcfe96408b3d9e1f120bcf298ce7b5b0d12467c5440c88dc7da83b5404b"}
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.008368 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zf9sp"
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.017473 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.117728 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-additional-scripts\") pod \"e05bc7f5-efe6-462b-91e6-07f5113858a9\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.117896 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtdlv\" (UniqueName: \"kubernetes.io/projected/e05bc7f5-efe6-462b-91e6-07f5113858a9-kube-api-access-xtdlv\") pod \"e05bc7f5-efe6-462b-91e6-07f5113858a9\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.118042 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run-ovn\") pod \"e05bc7f5-efe6-462b-91e6-07f5113858a9\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.118203 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e05bc7f5-efe6-462b-91e6-07f5113858a9" (UID: "e05bc7f5-efe6-462b-91e6-07f5113858a9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.118263 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6784bd0f-0863-4990-bc78-c04561fbd465-operator-scripts\") pod \"6784bd0f-0863-4990-bc78-c04561fbd465\" (UID: \"6784bd0f-0863-4990-bc78-c04561fbd465\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.118360 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-scripts\") pod \"e05bc7f5-efe6-462b-91e6-07f5113858a9\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.118943 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6784bd0f-0863-4990-bc78-c04561fbd465-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6784bd0f-0863-4990-bc78-c04561fbd465" (UID: "6784bd0f-0863-4990-bc78-c04561fbd465"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.119214 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e05bc7f5-efe6-462b-91e6-07f5113858a9" (UID: "e05bc7f5-efe6-462b-91e6-07f5113858a9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.119939 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-log-ovn\") pod \"e05bc7f5-efe6-462b-91e6-07f5113858a9\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.120013 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgjmx\" (UniqueName: \"kubernetes.io/projected/6784bd0f-0863-4990-bc78-c04561fbd465-kube-api-access-cgjmx\") pod \"6784bd0f-0863-4990-bc78-c04561fbd465\" (UID: \"6784bd0f-0863-4990-bc78-c04561fbd465\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.120051 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run\") pod \"e05bc7f5-efe6-462b-91e6-07f5113858a9\" (UID: \"e05bc7f5-efe6-462b-91e6-07f5113858a9\") "
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.120327 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-scripts" (OuterVolumeSpecName: "scripts") pod "e05bc7f5-efe6-462b-91e6-07f5113858a9" (UID: "e05bc7f5-efe6-462b-91e6-07f5113858a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.120418 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run" (OuterVolumeSpecName: "var-run") pod "e05bc7f5-efe6-462b-91e6-07f5113858a9" (UID: "e05bc7f5-efe6-462b-91e6-07f5113858a9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.120448 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e05bc7f5-efe6-462b-91e6-07f5113858a9" (UID: "e05bc7f5-efe6-462b-91e6-07f5113858a9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.120735 4725 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.121091 4725 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.121166 4725 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.121227 4725 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e05bc7f5-efe6-462b-91e6-07f5113858a9-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.121294 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6784bd0f-0863-4990-bc78-c04561fbd465-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.121360 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e05bc7f5-efe6-462b-91e6-07f5113858a9-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.125162 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6784bd0f-0863-4990-bc78-c04561fbd465-kube-api-access-cgjmx" (OuterVolumeSpecName: "kube-api-access-cgjmx") pod "6784bd0f-0863-4990-bc78-c04561fbd465" (UID: "6784bd0f-0863-4990-bc78-c04561fbd465"). InnerVolumeSpecName "kube-api-access-cgjmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.129094 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05bc7f5-efe6-462b-91e6-07f5113858a9-kube-api-access-xtdlv" (OuterVolumeSpecName: "kube-api-access-xtdlv") pod "e05bc7f5-efe6-462b-91e6-07f5113858a9" (UID: "e05bc7f5-efe6-462b-91e6-07f5113858a9"). InnerVolumeSpecName "kube-api-access-xtdlv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.222530 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xpvnr"
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.222960 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtdlv\" (UniqueName: \"kubernetes.io/projected/e05bc7f5-efe6-462b-91e6-07f5113858a9-kube-api-access-xtdlv\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.223120 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgjmx\" (UniqueName: \"kubernetes.io/projected/6784bd0f-0863-4990-bc78-c04561fbd465-kube-api-access-cgjmx\") on node \"crc\" DevicePath \"\""
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.488147 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xpvnr-config-qzwx5" event={"ID":"e05bc7f5-efe6-462b-91e6-07f5113858a9","Type":"ContainerDied","Data":"84bd527a339874f51458ec9df62b680f178db5c68c149d9e9b697f0cf81f799a"}
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.488213 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84bd527a339874f51458ec9df62b680f178db5c68c149d9e9b697f0cf81f799a"
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.488171 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xpvnr-config-qzwx5"
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.489984 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zf9sp" event={"ID":"6784bd0f-0863-4990-bc78-c04561fbd465","Type":"ContainerDied","Data":"ed65cf7db228124520de1bf5d6742c214d3a950a62bba7042e37f4b761ebc092"}
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.490019 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed65cf7db228124520de1bf5d6742c214d3a950a62bba7042e37f4b761ebc092"
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.490055 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zf9sp"
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.495869 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"7529f9eba974e1d4346e188c642753b0ac3733bc69505ced751ede4fb78f775c"}
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.495912 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"dcb92831cdb178c164cfdf12c1b4bb66db1422aef32f14823b4ac4b3ea2eca46"}
Feb 25 11:11:43 crc kubenswrapper[4725]: I0225 11:11:43.495925 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"e89348328191fdced897693431483ba75d222831431909a5f33045e1b92d29c0"}
Feb 25 11:11:44 crc kubenswrapper[4725]: I0225 11:11:44.149848 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xpvnr-config-qzwx5"]
Feb 25 11:11:44 crc kubenswrapper[4725]: I0225 11:11:44.162481 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xpvnr-config-qzwx5"]
Feb 25 11:11:45 crc kubenswrapper[4725]: I0225 11:11:45.248504 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05bc7f5-efe6-462b-91e6-07f5113858a9" path="/var/lib/kubelet/pods/e05bc7f5-efe6-462b-91e6-07f5113858a9/volumes"
Feb 25 11:11:49 crc kubenswrapper[4725]: I0225 11:11:49.251554 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Feb 25 11:11:49 crc kubenswrapper[4725]: I0225 11:11:49.552929 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"90d2536b3f8d1f1e712b085d37bfef405a6878d3548803cc7220a5ffbd9c0b78"} Feb 25 11:11:49 crc kubenswrapper[4725]: I0225 11:11:49.553160 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"373d5503ce74e1a34b91c524a1fa6654ea99f71086ff3215d651417cae988eb1"} Feb 25 11:11:49 crc kubenswrapper[4725]: I0225 11:11:49.553170 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"1990cbc39870f4e78de2afd36383ddb1d4e8bebbbf1115fb20568c671288253a"} Feb 25 11:11:49 crc kubenswrapper[4725]: I0225 11:11:49.553177 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"f23b33a5ae1b9388b8fe625fd88497ff89f917fc3eced1e170d40f550794dc24"} Feb 25 11:11:49 crc kubenswrapper[4725]: I0225 11:11:49.553187 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"b7eb5568017e1368c2501efb495fb9655ad89af9b6c4d1ead0b5c05318ddc7f7"} Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.567936 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"45a958a7b4b5334ad242c2f4e0f4652f41d6fab26596bce6382c3be7e9b054e6"} Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.568338 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d922deba-d455-45a7-ade3-dc2f588617bc","Type":"ContainerStarted","Data":"33f1165f8911fc9e4333f75035bf427688b1922e8a613d28b0ce469ab6cc81d4"} Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.628184 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=27.57213419 podStartE2EDuration="36.628163278s" podCreationTimestamp="2026-02-25 11:11:14 +0000 UTC" firstStartedPulling="2026-02-25 11:11:39.518385721 +0000 UTC m=+1125.016967756" lastFinishedPulling="2026-02-25 11:11:48.574414819 +0000 UTC m=+1134.072996844" observedRunningTime="2026-02-25 11:11:50.622859077 +0000 UTC m=+1136.121441112" watchObservedRunningTime="2026-02-25 11:11:50.628163278 +0000 UTC m=+1136.126745313" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.887474 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-dzv59"] Feb 25 11:11:50 crc kubenswrapper[4725]: E0225 11:11:50.887768 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05bc7f5-efe6-462b-91e6-07f5113858a9" containerName="ovn-config" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.887784 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05bc7f5-efe6-462b-91e6-07f5113858a9" containerName="ovn-config" Feb 25 11:11:50 crc kubenswrapper[4725]: E0225 11:11:50.887811 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5574881-8546-456a-96b2-d58158e8a447" containerName="swift-ring-rebalance" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.887818 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5574881-8546-456a-96b2-d58158e8a447" containerName="swift-ring-rebalance" Feb 25 11:11:50 crc kubenswrapper[4725]: E0225 11:11:50.887845 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6784bd0f-0863-4990-bc78-c04561fbd465" containerName="mariadb-account-create-update" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 
11:11:50.887852 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6784bd0f-0863-4990-bc78-c04561fbd465" containerName="mariadb-account-create-update" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.888001 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05bc7f5-efe6-462b-91e6-07f5113858a9" containerName="ovn-config" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.888017 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5574881-8546-456a-96b2-d58158e8a447" containerName="swift-ring-rebalance" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.888026 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6784bd0f-0863-4990-bc78-c04561fbd465" containerName="mariadb-account-create-update" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.888771 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.890632 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.905455 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-dzv59"] Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.949956 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.950175 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-swift-storage-0\") pod 
\"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.950380 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-config\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.950442 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bb6\" (UniqueName: \"kubernetes.io/projected/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-kube-api-access-b8bb6\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.950508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:50 crc kubenswrapper[4725]: I0225 11:11:50.950533 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-svc\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.052386 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.052539 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.052591 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-config\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.052621 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bb6\" (UniqueName: \"kubernetes.io/projected/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-kube-api-access-b8bb6\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.052657 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.052680 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-svc\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.053881 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-config\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.053946 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-svc\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.054130 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.054134 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.054549 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-dzv59\" 
(UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.079874 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bb6\" (UniqueName: \"kubernetes.io/projected/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-kube-api-access-b8bb6\") pod \"dnsmasq-dns-764c5664d7-dzv59\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.207670 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:51 crc kubenswrapper[4725]: I0225 11:11:51.743518 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-dzv59"] Feb 25 11:11:52 crc kubenswrapper[4725]: I0225 11:11:52.584152 4725 generic.go:334] "Generic (PLEG): container finished" podID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerID="5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e" exitCode=0 Feb 25 11:11:52 crc kubenswrapper[4725]: I0225 11:11:52.584214 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" event={"ID":"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c","Type":"ContainerDied","Data":"5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e"} Feb 25 11:11:52 crc kubenswrapper[4725]: I0225 11:11:52.584494 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" event={"ID":"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c","Type":"ContainerStarted","Data":"9ca6c20b09fd22c986baab37c2b2fe8ebe8dec0be7dae97ca486ec9c08f95402"} Feb 25 11:11:53 crc kubenswrapper[4725]: I0225 11:11:53.593408 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" 
event={"ID":"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c","Type":"ContainerStarted","Data":"4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64"} Feb 25 11:11:53 crc kubenswrapper[4725]: I0225 11:11:53.593918 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:53 crc kubenswrapper[4725]: I0225 11:11:53.595483 4725 generic.go:334] "Generic (PLEG): container finished" podID="88fc0f62-6868-40e9-a04e-5de23ca3e5fe" containerID="3c5dcef313b89405ca14bcd4691e17176c33f08f540c16811345ad7cd839f737" exitCode=0 Feb 25 11:11:53 crc kubenswrapper[4725]: I0225 11:11:53.595563 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jtvbl" event={"ID":"88fc0f62-6868-40e9-a04e-5de23ca3e5fe","Type":"ContainerDied","Data":"3c5dcef313b89405ca14bcd4691e17176c33f08f540c16811345ad7cd839f737"} Feb 25 11:11:53 crc kubenswrapper[4725]: I0225 11:11:53.617579 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" podStartSLOduration=3.617556649 podStartE2EDuration="3.617556649s" podCreationTimestamp="2026-02-25 11:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:53.610489442 +0000 UTC m=+1139.109071477" watchObservedRunningTime="2026-02-25 11:11:53.617556649 +0000 UTC m=+1139.116138674" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.080699 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.250794 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-config-data\") pod \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.251016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kd5w\" (UniqueName: \"kubernetes.io/projected/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-kube-api-access-5kd5w\") pod \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.251163 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-combined-ca-bundle\") pod \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.251218 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-db-sync-config-data\") pod \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\" (UID: \"88fc0f62-6868-40e9-a04e-5de23ca3e5fe\") " Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.257401 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "88fc0f62-6868-40e9-a04e-5de23ca3e5fe" (UID: "88fc0f62-6868-40e9-a04e-5de23ca3e5fe"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.258527 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-kube-api-access-5kd5w" (OuterVolumeSpecName: "kube-api-access-5kd5w") pod "88fc0f62-6868-40e9-a04e-5de23ca3e5fe" (UID: "88fc0f62-6868-40e9-a04e-5de23ca3e5fe"). InnerVolumeSpecName "kube-api-access-5kd5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.288641 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88fc0f62-6868-40e9-a04e-5de23ca3e5fe" (UID: "88fc0f62-6868-40e9-a04e-5de23ca3e5fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.319708 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-config-data" (OuterVolumeSpecName: "config-data") pod "88fc0f62-6868-40e9-a04e-5de23ca3e5fe" (UID: "88fc0f62-6868-40e9-a04e-5de23ca3e5fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.353047 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.353078 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kd5w\" (UniqueName: \"kubernetes.io/projected/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-kube-api-access-5kd5w\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.353089 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.353097 4725 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/88fc0f62-6868-40e9-a04e-5de23ca3e5fe-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.618813 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jtvbl" event={"ID":"88fc0f62-6868-40e9-a04e-5de23ca3e5fe","Type":"ContainerDied","Data":"32f91a7f72a8591c7caeaa56b397706991b4f19687be28e10b50a4b136521e91"} Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.619053 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32f91a7f72a8591c7caeaa56b397706991b4f19687be28e10b50a4b136521e91" Feb 25 11:11:55 crc kubenswrapper[4725]: I0225 11:11:55.619213 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jtvbl" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.028477 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-dzv59"] Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.029048 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" podUID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerName="dnsmasq-dns" containerID="cri-o://4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64" gracePeriod=10 Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.062629 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mrq7b"] Feb 25 11:11:56 crc kubenswrapper[4725]: E0225 11:11:56.062953 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fc0f62-6868-40e9-a04e-5de23ca3e5fe" containerName="glance-db-sync" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.062970 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fc0f62-6868-40e9-a04e-5de23ca3e5fe" containerName="glance-db-sync" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.063191 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fc0f62-6868-40e9-a04e-5de23ca3e5fe" containerName="glance-db-sync" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.064116 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.074680 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mrq7b"] Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.164381 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.164467 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.164515 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.164538 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nd2k\" (UniqueName: \"kubernetes.io/projected/652ed68d-108a-459a-8493-bb798b194940-kube-api-access-8nd2k\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.164565 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.164591 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-config\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.266176 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nd2k\" (UniqueName: \"kubernetes.io/projected/652ed68d-108a-459a-8493-bb798b194940-kube-api-access-8nd2k\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.266248 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.266281 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-config\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.266409 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.266474 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.266524 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.267672 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.268054 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-config\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.268389 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.268810 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.269984 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.287896 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nd2k\" (UniqueName: \"kubernetes.io/projected/652ed68d-108a-459a-8493-bb798b194940-kube-api-access-8nd2k\") pod \"dnsmasq-dns-74f6bcbc87-mrq7b\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.413573 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.530209 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.596323 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-config\") pod \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.596402 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-svc\") pod \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.596442 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-sb\") pod \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.596488 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-swift-storage-0\") pod \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.596555 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8bb6\" (UniqueName: \"kubernetes.io/projected/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-kube-api-access-b8bb6\") pod \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.596573 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-nb\") pod \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\" (UID: \"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c\") " Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.609560 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-kube-api-access-b8bb6" (OuterVolumeSpecName: "kube-api-access-b8bb6") pod "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" (UID: "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c"). InnerVolumeSpecName "kube-api-access-b8bb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.671486 4725 generic.go:334] "Generic (PLEG): container finished" podID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerID="4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64" exitCode=0 Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.671537 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" event={"ID":"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c","Type":"ContainerDied","Data":"4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64"} Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.671567 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" event={"ID":"12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c","Type":"ContainerDied","Data":"9ca6c20b09fd22c986baab37c2b2fe8ebe8dec0be7dae97ca486ec9c08f95402"} Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.671585 4725 scope.go:117] "RemoveContainer" containerID="4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.672274 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-dzv59" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.683442 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" (UID: "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.696344 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" (UID: "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.698807 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8bb6\" (UniqueName: \"kubernetes.io/projected/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-kube-api-access-b8bb6\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.698856 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.698866 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.715990 4725 scope.go:117] "RemoveContainer" containerID="5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.726488 4725 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-config" (OuterVolumeSpecName: "config") pod "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" (UID: "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.740169 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" (UID: "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.746817 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" (UID: "12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.768496 4725 scope.go:117] "RemoveContainer" containerID="4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64" Feb 25 11:11:56 crc kubenswrapper[4725]: E0225 11:11:56.773149 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64\": container with ID starting with 4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64 not found: ID does not exist" containerID="4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.773185 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64"} err="failed to get container status \"4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64\": rpc error: code = NotFound desc = could not find container \"4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64\": container with ID starting with 4083ad9ab7bf2fe90e3adf0d270a97cd3c49e09391e37d5e3ce09f3c8a915d64 not found: ID does not exist" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.773211 4725 scope.go:117] "RemoveContainer" containerID="5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e" Feb 25 11:11:56 crc kubenswrapper[4725]: E0225 11:11:56.780171 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e\": container with ID starting with 5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e not found: ID does not exist" containerID="5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.780231 
4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e"} err="failed to get container status \"5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e\": rpc error: code = NotFound desc = could not find container \"5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e\": container with ID starting with 5236d3e3ce7547abb8f79d1d7738d9331db6e030aa9859e64d32cfbe2bc25d7e not found: ID does not exist" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.801725 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.801759 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.801769 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:11:56 crc kubenswrapper[4725]: W0225 11:11:56.918435 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652ed68d_108a_459a_8493_bb798b194940.slice/crio-85294568e5a9d2e6566847bbbad31a89bc3513f9915dac0d6168ba334c2f49b9 WatchSource:0}: Error finding container 85294568e5a9d2e6566847bbbad31a89bc3513f9915dac0d6168ba334c2f49b9: Status 404 returned error can't find the container with id 85294568e5a9d2e6566847bbbad31a89bc3513f9915dac0d6168ba334c2f49b9 Feb 25 11:11:56 crc kubenswrapper[4725]: I0225 11:11:56.918533 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-74f6bcbc87-mrq7b"] Feb 25 11:11:57 crc kubenswrapper[4725]: I0225 11:11:57.019955 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-dzv59"] Feb 25 11:11:57 crc kubenswrapper[4725]: I0225 11:11:57.027143 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-dzv59"] Feb 25 11:11:57 crc kubenswrapper[4725]: I0225 11:11:57.239887 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" path="/var/lib/kubelet/pods/12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c/volumes" Feb 25 11:11:57 crc kubenswrapper[4725]: I0225 11:11:57.683181 4725 generic.go:334] "Generic (PLEG): container finished" podID="652ed68d-108a-459a-8493-bb798b194940" containerID="3115750fe611e697d5a909af4e3a31bf48b8f0e2ea525b29b833d0b1345582d5" exitCode=0 Feb 25 11:11:57 crc kubenswrapper[4725]: I0225 11:11:57.683243 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" event={"ID":"652ed68d-108a-459a-8493-bb798b194940","Type":"ContainerDied","Data":"3115750fe611e697d5a909af4e3a31bf48b8f0e2ea525b29b833d0b1345582d5"} Feb 25 11:11:57 crc kubenswrapper[4725]: I0225 11:11:57.683567 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" event={"ID":"652ed68d-108a-459a-8493-bb798b194940","Type":"ContainerStarted","Data":"85294568e5a9d2e6566847bbbad31a89bc3513f9915dac0d6168ba334c2f49b9"} Feb 25 11:11:58 crc kubenswrapper[4725]: I0225 11:11:58.693570 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" event={"ID":"652ed68d-108a-459a-8493-bb798b194940","Type":"ContainerStarted","Data":"742c379fd1cce6dd119e7b9543d95d51a4735d72b9453b172fd15b8c71a2bb4f"} Feb 25 11:11:58 crc kubenswrapper[4725]: I0225 11:11:58.693893 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:11:58 crc kubenswrapper[4725]: I0225 11:11:58.717554 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" podStartSLOduration=2.7175372639999997 podStartE2EDuration="2.717537264s" podCreationTimestamp="2026-02-25 11:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:11:58.712889261 +0000 UTC m=+1144.211471296" watchObservedRunningTime="2026-02-25 11:11:58.717537264 +0000 UTC m=+1144.216119289" Feb 25 11:11:58 crc kubenswrapper[4725]: I0225 11:11:58.980105 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.270014 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.318742 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6xkqp"] Feb 25 11:11:59 crc kubenswrapper[4725]: E0225 11:11:59.319084 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerName="init" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.319105 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerName="init" Feb 25 11:11:59 crc kubenswrapper[4725]: E0225 11:11:59.319135 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerName="dnsmasq-dns" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.319143 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerName="dnsmasq-dns" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.319285 4725 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="12810ce4-2821-4e8d-b0ea-b2f8f22ccd0c" containerName="dnsmasq-dns" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.319768 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.333209 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6xkqp"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.340951 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7f38-account-create-update-ggt5m"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.341996 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.344252 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.360857 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7f38-account-create-update-ggt5m"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.372497 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43106b29-d57b-47d4-90dd-9ea16422dc05-operator-scripts\") pod \"cinder-db-create-6xkqp\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.372628 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npvlr\" (UniqueName: \"kubernetes.io/projected/43106b29-d57b-47d4-90dd-9ea16422dc05-kube-api-access-npvlr\") pod \"cinder-db-create-6xkqp\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 
11:11:59.473880 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f83fe8-1a7b-4411-9dc3-611c0affe393-operator-scripts\") pod \"cinder-7f38-account-create-update-ggt5m\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.474394 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43106b29-d57b-47d4-90dd-9ea16422dc05-operator-scripts\") pod \"cinder-db-create-6xkqp\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.474658 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npvlr\" (UniqueName: \"kubernetes.io/projected/43106b29-d57b-47d4-90dd-9ea16422dc05-kube-api-access-npvlr\") pod \"cinder-db-create-6xkqp\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.474715 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95x6h\" (UniqueName: \"kubernetes.io/projected/17f83fe8-1a7b-4411-9dc3-611c0affe393-kube-api-access-95x6h\") pod \"cinder-7f38-account-create-update-ggt5m\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.474991 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43106b29-d57b-47d4-90dd-9ea16422dc05-operator-scripts\") pod \"cinder-db-create-6xkqp\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc 
kubenswrapper[4725]: I0225 11:11:59.498425 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npvlr\" (UniqueName: \"kubernetes.io/projected/43106b29-d57b-47d4-90dd-9ea16422dc05-kube-api-access-npvlr\") pod \"cinder-db-create-6xkqp\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.528180 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e785-account-create-update-hhpzr"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.529262 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.536799 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e785-account-create-update-hhpzr"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.541696 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.557391 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-9l6bp"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.563534 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.570420 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9l6bp"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.576347 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de8d476c-f390-4e03-a518-e0998e0586df-operator-scripts\") pod \"barbican-e785-account-create-update-hhpzr\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.576408 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95x6h\" (UniqueName: \"kubernetes.io/projected/17f83fe8-1a7b-4411-9dc3-611c0affe393-kube-api-access-95x6h\") pod \"cinder-7f38-account-create-update-ggt5m\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.576460 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vw4k\" (UniqueName: \"kubernetes.io/projected/de8d476c-f390-4e03-a518-e0998e0586df-kube-api-access-6vw4k\") pod \"barbican-e785-account-create-update-hhpzr\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.576498 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f83fe8-1a7b-4411-9dc3-611c0affe393-operator-scripts\") pod \"cinder-7f38-account-create-update-ggt5m\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 
11:11:59.577169 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f83fe8-1a7b-4411-9dc3-611c0affe393-operator-scripts\") pod \"cinder-7f38-account-create-update-ggt5m\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.626409 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-ghddn"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.627660 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ghddn" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.627672 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95x6h\" (UniqueName: \"kubernetes.io/projected/17f83fe8-1a7b-4411-9dc3-611c0affe393-kube-api-access-95x6h\") pod \"cinder-7f38-account-create-update-ggt5m\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.633530 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-thmbj"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.634451 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.637451 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6xkqp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.639926 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.640071 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bt58t" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.640721 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.641139 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.657786 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.663698 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ghddn"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678212 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b501c3bd-07f8-4780-8b55-14db55bc346f-operator-scripts\") pod \"neutron-db-create-ghddn\" (UID: \"b501c3bd-07f8-4780-8b55-14db55bc346f\") " pod="openstack/neutron-db-create-ghddn" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678261 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vmjs\" (UniqueName: \"kubernetes.io/projected/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-kube-api-access-8vmjs\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678306 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jntvv\" (UniqueName: \"kubernetes.io/projected/b501c3bd-07f8-4780-8b55-14db55bc346f-kube-api-access-jntvv\") pod \"neutron-db-create-ghddn\" (UID: \"b501c3bd-07f8-4780-8b55-14db55bc346f\") " pod="openstack/neutron-db-create-ghddn" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678349 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vw4k\" (UniqueName: \"kubernetes.io/projected/de8d476c-f390-4e03-a518-e0998e0586df-kube-api-access-6vw4k\") pod \"barbican-e785-account-create-update-hhpzr\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678398 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-combined-ca-bundle\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678431 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-config-data\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678501 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de8d476c-f390-4e03-a518-e0998e0586df-operator-scripts\") pod \"barbican-e785-account-create-update-hhpzr\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: 
I0225 11:11:59.678527 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjmvw\" (UniqueName: \"kubernetes.io/projected/31ef9958-bb4e-4bf1-a118-d11d04bff97b-kube-api-access-zjmvw\") pod \"barbican-db-create-9l6bp\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.678561 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ef9958-bb4e-4bf1-a118-d11d04bff97b-operator-scripts\") pod \"barbican-db-create-9l6bp\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.681164 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de8d476c-f390-4e03-a518-e0998e0586df-operator-scripts\") pod \"barbican-e785-account-create-update-hhpzr\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.688601 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-thmbj"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.704820 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vw4k\" (UniqueName: \"kubernetes.io/projected/de8d476c-f390-4e03-a518-e0998e0586df-kube-api-access-6vw4k\") pod \"barbican-e785-account-create-update-hhpzr\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.737763 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1a81-account-create-update-7k7sp"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 
11:11:59.738780 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.744436 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.755356 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1a81-account-create-update-7k7sp"] Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.779884 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vmjs\" (UniqueName: \"kubernetes.io/projected/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-kube-api-access-8vmjs\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.779956 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jntvv\" (UniqueName: \"kubernetes.io/projected/b501c3bd-07f8-4780-8b55-14db55bc346f-kube-api-access-jntvv\") pod \"neutron-db-create-ghddn\" (UID: \"b501c3bd-07f8-4780-8b55-14db55bc346f\") " pod="openstack/neutron-db-create-ghddn" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.779990 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262e67d4-08ee-405a-aa45-14c222a8e9f1-operator-scripts\") pod \"neutron-1a81-account-create-update-7k7sp\" (UID: \"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.780015 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxss4\" (UniqueName: \"kubernetes.io/projected/262e67d4-08ee-405a-aa45-14c222a8e9f1-kube-api-access-qxss4\") 
pod \"neutron-1a81-account-create-update-7k7sp\" (UID: \"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.780055 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-combined-ca-bundle\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.780085 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-config-data\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.780155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjmvw\" (UniqueName: \"kubernetes.io/projected/31ef9958-bb4e-4bf1-a118-d11d04bff97b-kube-api-access-zjmvw\") pod \"barbican-db-create-9l6bp\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.780177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ef9958-bb4e-4bf1-a118-d11d04bff97b-operator-scripts\") pod \"barbican-db-create-9l6bp\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.780222 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b501c3bd-07f8-4780-8b55-14db55bc346f-operator-scripts\") pod \"neutron-db-create-ghddn\" (UID: 
\"b501c3bd-07f8-4780-8b55-14db55bc346f\") " pod="openstack/neutron-db-create-ghddn" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.781245 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b501c3bd-07f8-4780-8b55-14db55bc346f-operator-scripts\") pod \"neutron-db-create-ghddn\" (UID: \"b501c3bd-07f8-4780-8b55-14db55bc346f\") " pod="openstack/neutron-db-create-ghddn" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.781927 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ef9958-bb4e-4bf1-a118-d11d04bff97b-operator-scripts\") pod \"barbican-db-create-9l6bp\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.784492 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-combined-ca-bundle\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.785482 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-config-data\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.798357 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjmvw\" (UniqueName: \"kubernetes.io/projected/31ef9958-bb4e-4bf1-a118-d11d04bff97b-kube-api-access-zjmvw\") pod \"barbican-db-create-9l6bp\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: 
I0225 11:11:59.799971 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jntvv\" (UniqueName: \"kubernetes.io/projected/b501c3bd-07f8-4780-8b55-14db55bc346f-kube-api-access-jntvv\") pod \"neutron-db-create-ghddn\" (UID: \"b501c3bd-07f8-4780-8b55-14db55bc346f\") " pod="openstack/neutron-db-create-ghddn" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.803200 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vmjs\" (UniqueName: \"kubernetes.io/projected/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-kube-api-access-8vmjs\") pod \"keystone-db-sync-thmbj\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " pod="openstack/keystone-db-sync-thmbj" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.864690 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.878931 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-9l6bp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.882196 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262e67d4-08ee-405a-aa45-14c222a8e9f1-operator-scripts\") pod \"neutron-1a81-account-create-update-7k7sp\" (UID: \"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.882264 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxss4\" (UniqueName: \"kubernetes.io/projected/262e67d4-08ee-405a-aa45-14c222a8e9f1-kube-api-access-qxss4\") pod \"neutron-1a81-account-create-update-7k7sp\" (UID: \"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.883720 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262e67d4-08ee-405a-aa45-14c222a8e9f1-operator-scripts\") pod \"neutron-1a81-account-create-update-7k7sp\" (UID: \"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.903200 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxss4\" (UniqueName: \"kubernetes.io/projected/262e67d4-08ee-405a-aa45-14c222a8e9f1-kube-api-access-qxss4\") pod \"neutron-1a81-account-create-update-7k7sp\" (UID: \"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:11:59 crc kubenswrapper[4725]: I0225 11:11:59.959956 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ghddn" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.058099 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-thmbj" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.073484 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.143060 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533632-msl4k"] Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.144153 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533632-msl4k" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.149976 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.150008 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.150390 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.152425 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533632-msl4k"] Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.186692 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svb42\" (UniqueName: \"kubernetes.io/projected/0b720fd7-adf3-460d-a61b-832c8c974dc0-kube-api-access-svb42\") pod \"auto-csr-approver-29533632-msl4k\" (UID: \"0b720fd7-adf3-460d-a61b-832c8c974dc0\") " pod="openshift-infra/auto-csr-approver-29533632-msl4k" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 
11:12:00.259278 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7f38-account-create-update-ggt5m"] Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.269714 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6xkqp"] Feb 25 11:12:00 crc kubenswrapper[4725]: W0225 11:12:00.282338 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17f83fe8_1a7b_4411_9dc3_611c0affe393.slice/crio-30a9095e7b1c2dbab1afdc506ac73a5a53fd907c80d15488814caf4dcd0cdf36 WatchSource:0}: Error finding container 30a9095e7b1c2dbab1afdc506ac73a5a53fd907c80d15488814caf4dcd0cdf36: Status 404 returned error can't find the container with id 30a9095e7b1c2dbab1afdc506ac73a5a53fd907c80d15488814caf4dcd0cdf36 Feb 25 11:12:00 crc kubenswrapper[4725]: W0225 11:12:00.282572 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43106b29_d57b_47d4_90dd_9ea16422dc05.slice/crio-4f95484817edce0a624786d998f939bcc5b8754457823c134628e19f9793c857 WatchSource:0}: Error finding container 4f95484817edce0a624786d998f939bcc5b8754457823c134628e19f9793c857: Status 404 returned error can't find the container with id 4f95484817edce0a624786d998f939bcc5b8754457823c134628e19f9793c857 Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.288744 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svb42\" (UniqueName: \"kubernetes.io/projected/0b720fd7-adf3-460d-a61b-832c8c974dc0-kube-api-access-svb42\") pod \"auto-csr-approver-29533632-msl4k\" (UID: \"0b720fd7-adf3-460d-a61b-832c8c974dc0\") " pod="openshift-infra/auto-csr-approver-29533632-msl4k" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.312345 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svb42\" (UniqueName: 
\"kubernetes.io/projected/0b720fd7-adf3-460d-a61b-832c8c974dc0-kube-api-access-svb42\") pod \"auto-csr-approver-29533632-msl4k\" (UID: \"0b720fd7-adf3-460d-a61b-832c8c974dc0\") " pod="openshift-infra/auto-csr-approver-29533632-msl4k" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.466000 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e785-account-create-update-hhpzr"] Feb 25 11:12:00 crc kubenswrapper[4725]: W0225 11:12:00.479458 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde8d476c_f390_4e03_a518_e0998e0586df.slice/crio-3238f812b21a95b3c4b74a4ee53e7f7707fab6e3be1c1d0c4ca23118c2ef012e WatchSource:0}: Error finding container 3238f812b21a95b3c4b74a4ee53e7f7707fab6e3be1c1d0c4ca23118c2ef012e: Status 404 returned error can't find the container with id 3238f812b21a95b3c4b74a4ee53e7f7707fab6e3be1c1d0c4ca23118c2ef012e Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.485235 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9l6bp"] Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.517226 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533632-msl4k" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.645441 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-thmbj"] Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.752649 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6xkqp" event={"ID":"43106b29-d57b-47d4-90dd-9ea16422dc05","Type":"ContainerStarted","Data":"04fe06c2b6d673e2ef30572a7e476f5fda8aecca44a89396f766f1cc9d31f721"} Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.753075 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6xkqp" event={"ID":"43106b29-d57b-47d4-90dd-9ea16422dc05","Type":"ContainerStarted","Data":"4f95484817edce0a624786d998f939bcc5b8754457823c134628e19f9793c857"} Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.769424 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ghddn"] Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.772876 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9l6bp" event={"ID":"31ef9958-bb4e-4bf1-a118-d11d04bff97b","Type":"ContainerStarted","Data":"13d3a702c48d6dd07b3a4e4bf9cf356554690fd1ac6383902736fbc66a55df47"} Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.792938 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1a81-account-create-update-7k7sp"] Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.801375 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f38-account-create-update-ggt5m" event={"ID":"17f83fe8-1a7b-4411-9dc3-611c0affe393","Type":"ContainerStarted","Data":"aa58b712619d1f2d6db3ad97f9a2389e783cd9d557280bc73843561d8ea6bbc1"} Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.801421 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-7f38-account-create-update-ggt5m" event={"ID":"17f83fe8-1a7b-4411-9dc3-611c0affe393","Type":"ContainerStarted","Data":"30a9095e7b1c2dbab1afdc506ac73a5a53fd907c80d15488814caf4dcd0cdf36"} Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.803125 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-6xkqp" podStartSLOduration=1.803111614 podStartE2EDuration="1.803111614s" podCreationTimestamp="2026-02-25 11:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:00.781325188 +0000 UTC m=+1146.279907223" watchObservedRunningTime="2026-02-25 11:12:00.803111614 +0000 UTC m=+1146.301693639" Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.807514 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thmbj" event={"ID":"6a4bfbae-237f-4d52-9b5d-f47217a2c88c","Type":"ContainerStarted","Data":"99b3a4f0e6e44b8c8ac81161e67da1a1defafa4fed98274ef4d45d92be101489"} Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.818215 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e785-account-create-update-hhpzr" event={"ID":"de8d476c-f390-4e03-a518-e0998e0586df","Type":"ContainerStarted","Data":"3238f812b21a95b3c4b74a4ee53e7f7707fab6e3be1c1d0c4ca23118c2ef012e"} Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.832367 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7f38-account-create-update-ggt5m" podStartSLOduration=1.8323482370000002 podStartE2EDuration="1.832348237s" podCreationTimestamp="2026-02-25 11:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:00.819111857 +0000 UTC m=+1146.317693882" watchObservedRunningTime="2026-02-25 11:12:00.832348237 +0000 UTC m=+1146.330930262" 
Feb 25 11:12:00 crc kubenswrapper[4725]: I0225 11:12:00.925269 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533632-msl4k"] Feb 25 11:12:00 crc kubenswrapper[4725]: W0225 11:12:00.954508 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b720fd7_adf3_460d_a61b_832c8c974dc0.slice/crio-a34e69f71c1dc53bb9bc9f66ce0e65ec677738afcc0e52a342168050d1d0e2cf WatchSource:0}: Error finding container a34e69f71c1dc53bb9bc9f66ce0e65ec677738afcc0e52a342168050d1d0e2cf: Status 404 returned error can't find the container with id a34e69f71c1dc53bb9bc9f66ce0e65ec677738afcc0e52a342168050d1d0e2cf Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.838635 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533632-msl4k" event={"ID":"0b720fd7-adf3-460d-a61b-832c8c974dc0","Type":"ContainerStarted","Data":"a34e69f71c1dc53bb9bc9f66ce0e65ec677738afcc0e52a342168050d1d0e2cf"} Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.840238 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ghddn" event={"ID":"b501c3bd-07f8-4780-8b55-14db55bc346f","Type":"ContainerStarted","Data":"4d89a61cf32541dd22b107d0313929e2b1347111d44d1af4d344b8ea2e1aa9ec"} Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.840262 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ghddn" event={"ID":"b501c3bd-07f8-4780-8b55-14db55bc346f","Type":"ContainerStarted","Data":"4f031089282551f81052485effaa43b3d47fd23acccad0fcd17fe725be89cad7"} Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.845241 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e785-account-create-update-hhpzr" event={"ID":"de8d476c-f390-4e03-a518-e0998e0586df","Type":"ContainerStarted","Data":"32d8bb86ee8f58e701702a0b106c6c70953402777c533db91beae7134a2416af"} Feb 25 
11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.847213 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a81-account-create-update-7k7sp" event={"ID":"262e67d4-08ee-405a-aa45-14c222a8e9f1","Type":"ContainerStarted","Data":"806db84677e11365b8ea664801ada795eed2f35ed7ced59f3563df3bf8411ec5"} Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.847237 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a81-account-create-update-7k7sp" event={"ID":"262e67d4-08ee-405a-aa45-14c222a8e9f1","Type":"ContainerStarted","Data":"563546219d7587a6ca1fe4575a265c63560d979cfa6c90296e63a5ba369bd95d"} Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.849846 4725 generic.go:334] "Generic (PLEG): container finished" podID="43106b29-d57b-47d4-90dd-9ea16422dc05" containerID="04fe06c2b6d673e2ef30572a7e476f5fda8aecca44a89396f766f1cc9d31f721" exitCode=0 Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.849911 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6xkqp" event={"ID":"43106b29-d57b-47d4-90dd-9ea16422dc05","Type":"ContainerDied","Data":"04fe06c2b6d673e2ef30572a7e476f5fda8aecca44a89396f766f1cc9d31f721"} Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.851596 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9l6bp" event={"ID":"31ef9958-bb4e-4bf1-a118-d11d04bff97b","Type":"ContainerStarted","Data":"08157017ad2ccea94a32e94385eaaf166984488b53f4b13f17f761edb2ec0134"} Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.864515 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-ghddn" podStartSLOduration=2.864500992 podStartE2EDuration="2.864500992s" podCreationTimestamp="2026-02-25 11:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:01.860304371 +0000 UTC m=+1147.358886406" 
watchObservedRunningTime="2026-02-25 11:12:01.864500992 +0000 UTC m=+1147.363083017" Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.878933 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-9l6bp" podStartSLOduration=2.878915063 podStartE2EDuration="2.878915063s" podCreationTimestamp="2026-02-25 11:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:01.871176038 +0000 UTC m=+1147.369758083" watchObservedRunningTime="2026-02-25 11:12:01.878915063 +0000 UTC m=+1147.377497088" Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.918515 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-e785-account-create-update-hhpzr" podStartSLOduration=2.918486129 podStartE2EDuration="2.918486129s" podCreationTimestamp="2026-02-25 11:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:01.903162464 +0000 UTC m=+1147.401744489" watchObservedRunningTime="2026-02-25 11:12:01.918486129 +0000 UTC m=+1147.417068164" Feb 25 11:12:01 crc kubenswrapper[4725]: I0225 11:12:01.928775 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1a81-account-create-update-7k7sp" podStartSLOduration=2.9287488809999997 podStartE2EDuration="2.928748881s" podCreationTimestamp="2026-02-25 11:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:01.919486446 +0000 UTC m=+1147.418068491" watchObservedRunningTime="2026-02-25 11:12:01.928748881 +0000 UTC m=+1147.427330926" Feb 25 11:12:02 crc kubenswrapper[4725]: I0225 11:12:02.867446 4725 generic.go:334] "Generic (PLEG): container finished" podID="b501c3bd-07f8-4780-8b55-14db55bc346f" 
containerID="4d89a61cf32541dd22b107d0313929e2b1347111d44d1af4d344b8ea2e1aa9ec" exitCode=0 Feb 25 11:12:02 crc kubenswrapper[4725]: I0225 11:12:02.867494 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ghddn" event={"ID":"b501c3bd-07f8-4780-8b55-14db55bc346f","Type":"ContainerDied","Data":"4d89a61cf32541dd22b107d0313929e2b1347111d44d1af4d344b8ea2e1aa9ec"} Feb 25 11:12:02 crc kubenswrapper[4725]: I0225 11:12:02.869510 4725 generic.go:334] "Generic (PLEG): container finished" podID="31ef9958-bb4e-4bf1-a118-d11d04bff97b" containerID="08157017ad2ccea94a32e94385eaaf166984488b53f4b13f17f761edb2ec0134" exitCode=0 Feb 25 11:12:02 crc kubenswrapper[4725]: I0225 11:12:02.869590 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9l6bp" event={"ID":"31ef9958-bb4e-4bf1-a118-d11d04bff97b","Type":"ContainerDied","Data":"08157017ad2ccea94a32e94385eaaf166984488b53f4b13f17f761edb2ec0134"} Feb 25 11:12:03 crc kubenswrapper[4725]: I0225 11:12:03.880036 4725 generic.go:334] "Generic (PLEG): container finished" podID="de8d476c-f390-4e03-a518-e0998e0586df" containerID="32d8bb86ee8f58e701702a0b106c6c70953402777c533db91beae7134a2416af" exitCode=0 Feb 25 11:12:03 crc kubenswrapper[4725]: I0225 11:12:03.880254 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e785-account-create-update-hhpzr" event={"ID":"de8d476c-f390-4e03-a518-e0998e0586df","Type":"ContainerDied","Data":"32d8bb86ee8f58e701702a0b106c6c70953402777c533db91beae7134a2416af"} Feb 25 11:12:03 crc kubenswrapper[4725]: I0225 11:12:03.882274 4725 generic.go:334] "Generic (PLEG): container finished" podID="262e67d4-08ee-405a-aa45-14c222a8e9f1" containerID="806db84677e11365b8ea664801ada795eed2f35ed7ced59f3563df3bf8411ec5" exitCode=0 Feb 25 11:12:03 crc kubenswrapper[4725]: I0225 11:12:03.882354 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a81-account-create-update-7k7sp" 
event={"ID":"262e67d4-08ee-405a-aa45-14c222a8e9f1","Type":"ContainerDied","Data":"806db84677e11365b8ea664801ada795eed2f35ed7ced59f3563df3bf8411ec5"} Feb 25 11:12:03 crc kubenswrapper[4725]: I0225 11:12:03.883639 4725 generic.go:334] "Generic (PLEG): container finished" podID="17f83fe8-1a7b-4411-9dc3-611c0affe393" containerID="aa58b712619d1f2d6db3ad97f9a2389e783cd9d557280bc73843561d8ea6bbc1" exitCode=0 Feb 25 11:12:03 crc kubenswrapper[4725]: I0225 11:12:03.883795 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f38-account-create-update-ggt5m" event={"ID":"17f83fe8-1a7b-4411-9dc3-611c0affe393","Type":"ContainerDied","Data":"aa58b712619d1f2d6db3ad97f9a2389e783cd9d557280bc73843561d8ea6bbc1"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.361679 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9l6bp" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.385142 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.412327 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.415839 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.432139 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjmvw\" (UniqueName: \"kubernetes.io/projected/31ef9958-bb4e-4bf1-a118-d11d04bff97b-kube-api-access-zjmvw\") pod \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.432274 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95x6h\" (UniqueName: \"kubernetes.io/projected/17f83fe8-1a7b-4411-9dc3-611c0affe393-kube-api-access-95x6h\") pod \"17f83fe8-1a7b-4411-9dc3-611c0affe393\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.432416 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ef9958-bb4e-4bf1-a118-d11d04bff97b-operator-scripts\") pod \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\" (UID: \"31ef9958-bb4e-4bf1-a118-d11d04bff97b\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.433166 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f83fe8-1a7b-4411-9dc3-611c0affe393-operator-scripts\") pod \"17f83fe8-1a7b-4411-9dc3-611c0affe393\" (UID: \"17f83fe8-1a7b-4411-9dc3-611c0affe393\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.434018 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17f83fe8-1a7b-4411-9dc3-611c0affe393-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"17f83fe8-1a7b-4411-9dc3-611c0affe393" (UID: "17f83fe8-1a7b-4411-9dc3-611c0affe393"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.439572 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31ef9958-bb4e-4bf1-a118-d11d04bff97b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31ef9958-bb4e-4bf1-a118-d11d04bff97b" (UID: "31ef9958-bb4e-4bf1-a118-d11d04bff97b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.441087 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.446165 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ef9958-bb4e-4bf1-a118-d11d04bff97b-kube-api-access-zjmvw" (OuterVolumeSpecName: "kube-api-access-zjmvw") pod "31ef9958-bb4e-4bf1-a118-d11d04bff97b" (UID: "31ef9958-bb4e-4bf1-a118-d11d04bff97b"). InnerVolumeSpecName "kube-api-access-zjmvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.447466 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f83fe8-1a7b-4411-9dc3-611c0affe393-kube-api-access-95x6h" (OuterVolumeSpecName: "kube-api-access-95x6h") pod "17f83fe8-1a7b-4411-9dc3-611c0affe393" (UID: "17f83fe8-1a7b-4411-9dc3-611c0affe393"). InnerVolumeSpecName "kube-api-access-95x6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.512591 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6xkqp" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.512244 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7frxh"] Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.515252 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-7frxh" podUID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerName="dnsmasq-dns" containerID="cri-o://d03c3df2831f435eec79aa4c11fd77f615f21dd1d257dba5c76fc719b708a1de" gracePeriod=10 Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.535570 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxss4\" (UniqueName: \"kubernetes.io/projected/262e67d4-08ee-405a-aa45-14c222a8e9f1-kube-api-access-qxss4\") pod \"262e67d4-08ee-405a-aa45-14c222a8e9f1\" (UID: \"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.535715 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de8d476c-f390-4e03-a518-e0998e0586df-operator-scripts\") pod \"de8d476c-f390-4e03-a518-e0998e0586df\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.535743 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vw4k\" (UniqueName: \"kubernetes.io/projected/de8d476c-f390-4e03-a518-e0998e0586df-kube-api-access-6vw4k\") pod \"de8d476c-f390-4e03-a518-e0998e0586df\" (UID: \"de8d476c-f390-4e03-a518-e0998e0586df\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.535759 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262e67d4-08ee-405a-aa45-14c222a8e9f1-operator-scripts\") pod \"262e67d4-08ee-405a-aa45-14c222a8e9f1\" (UID: 
\"262e67d4-08ee-405a-aa45-14c222a8e9f1\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.536350 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f83fe8-1a7b-4411-9dc3-611c0affe393-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.536362 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjmvw\" (UniqueName: \"kubernetes.io/projected/31ef9958-bb4e-4bf1-a118-d11d04bff97b-kube-api-access-zjmvw\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.536372 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95x6h\" (UniqueName: \"kubernetes.io/projected/17f83fe8-1a7b-4411-9dc3-611c0affe393-kube-api-access-95x6h\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.536380 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ef9958-bb4e-4bf1-a118-d11d04bff97b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.537360 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/262e67d4-08ee-405a-aa45-14c222a8e9f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "262e67d4-08ee-405a-aa45-14c222a8e9f1" (UID: "262e67d4-08ee-405a-aa45-14c222a8e9f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.537774 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d476c-f390-4e03-a518-e0998e0586df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de8d476c-f390-4e03-a518-e0998e0586df" (UID: "de8d476c-f390-4e03-a518-e0998e0586df"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.539293 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ghddn" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.541216 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8d476c-f390-4e03-a518-e0998e0586df-kube-api-access-6vw4k" (OuterVolumeSpecName: "kube-api-access-6vw4k") pod "de8d476c-f390-4e03-a518-e0998e0586df" (UID: "de8d476c-f390-4e03-a518-e0998e0586df"). InnerVolumeSpecName "kube-api-access-6vw4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.565672 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262e67d4-08ee-405a-aa45-14c222a8e9f1-kube-api-access-qxss4" (OuterVolumeSpecName: "kube-api-access-qxss4") pod "262e67d4-08ee-405a-aa45-14c222a8e9f1" (UID: "262e67d4-08ee-405a-aa45-14c222a8e9f1"). InnerVolumeSpecName "kube-api-access-qxss4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.637374 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jntvv\" (UniqueName: \"kubernetes.io/projected/b501c3bd-07f8-4780-8b55-14db55bc346f-kube-api-access-jntvv\") pod \"b501c3bd-07f8-4780-8b55-14db55bc346f\" (UID: \"b501c3bd-07f8-4780-8b55-14db55bc346f\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.637458 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npvlr\" (UniqueName: \"kubernetes.io/projected/43106b29-d57b-47d4-90dd-9ea16422dc05-kube-api-access-npvlr\") pod \"43106b29-d57b-47d4-90dd-9ea16422dc05\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.637475 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43106b29-d57b-47d4-90dd-9ea16422dc05-operator-scripts\") pod \"43106b29-d57b-47d4-90dd-9ea16422dc05\" (UID: \"43106b29-d57b-47d4-90dd-9ea16422dc05\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.637536 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b501c3bd-07f8-4780-8b55-14db55bc346f-operator-scripts\") pod \"b501c3bd-07f8-4780-8b55-14db55bc346f\" (UID: \"b501c3bd-07f8-4780-8b55-14db55bc346f\") " Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.637874 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de8d476c-f390-4e03-a518-e0998e0586df-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.638513 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vw4k\" (UniqueName: 
\"kubernetes.io/projected/de8d476c-f390-4e03-a518-e0998e0586df-kube-api-access-6vw4k\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.638531 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262e67d4-08ee-405a-aa45-14c222a8e9f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.638540 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxss4\" (UniqueName: \"kubernetes.io/projected/262e67d4-08ee-405a-aa45-14c222a8e9f1-kube-api-access-qxss4\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.638205 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43106b29-d57b-47d4-90dd-9ea16422dc05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43106b29-d57b-47d4-90dd-9ea16422dc05" (UID: "43106b29-d57b-47d4-90dd-9ea16422dc05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.638454 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b501c3bd-07f8-4780-8b55-14db55bc346f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b501c3bd-07f8-4780-8b55-14db55bc346f" (UID: "b501c3bd-07f8-4780-8b55-14db55bc346f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.643963 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43106b29-d57b-47d4-90dd-9ea16422dc05-kube-api-access-npvlr" (OuterVolumeSpecName: "kube-api-access-npvlr") pod "43106b29-d57b-47d4-90dd-9ea16422dc05" (UID: "43106b29-d57b-47d4-90dd-9ea16422dc05"). InnerVolumeSpecName "kube-api-access-npvlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.644161 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b501c3bd-07f8-4780-8b55-14db55bc346f-kube-api-access-jntvv" (OuterVolumeSpecName: "kube-api-access-jntvv") pod "b501c3bd-07f8-4780-8b55-14db55bc346f" (UID: "b501c3bd-07f8-4780-8b55-14db55bc346f"). InnerVolumeSpecName "kube-api-access-jntvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.740023 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npvlr\" (UniqueName: \"kubernetes.io/projected/43106b29-d57b-47d4-90dd-9ea16422dc05-kube-api-access-npvlr\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.740061 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43106b29-d57b-47d4-90dd-9ea16422dc05-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.740071 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b501c3bd-07f8-4780-8b55-14db55bc346f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.740080 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jntvv\" (UniqueName: \"kubernetes.io/projected/b501c3bd-07f8-4780-8b55-14db55bc346f-kube-api-access-jntvv\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.914819 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a81-account-create-update-7k7sp" event={"ID":"262e67d4-08ee-405a-aa45-14c222a8e9f1","Type":"ContainerDied","Data":"563546219d7587a6ca1fe4575a265c63560d979cfa6c90296e63a5ba369bd95d"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 
11:12:06.915109 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563546219d7587a6ca1fe4575a265c63560d979cfa6c90296e63a5ba369bd95d" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.915038 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a81-account-create-update-7k7sp" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.917819 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thmbj" event={"ID":"6a4bfbae-237f-4d52-9b5d-f47217a2c88c","Type":"ContainerStarted","Data":"71dc519a8394ebf396361c5207f79340bf5433e2dda989ab616b9bbd6e2d41ad"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.922054 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6xkqp" event={"ID":"43106b29-d57b-47d4-90dd-9ea16422dc05","Type":"ContainerDied","Data":"4f95484817edce0a624786d998f939bcc5b8754457823c134628e19f9793c857"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.922162 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f95484817edce0a624786d998f939bcc5b8754457823c134628e19f9793c857" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.922269 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6xkqp" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.935634 4725 generic.go:334] "Generic (PLEG): container finished" podID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerID="d03c3df2831f435eec79aa4c11fd77f615f21dd1d257dba5c76fc719b708a1de" exitCode=0 Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.935690 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7frxh" event={"ID":"9c296aab-4223-43bb-a032-45b20ffeaab5","Type":"ContainerDied","Data":"d03c3df2831f435eec79aa4c11fd77f615f21dd1d257dba5c76fc719b708a1de"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.935719 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7frxh" event={"ID":"9c296aab-4223-43bb-a032-45b20ffeaab5","Type":"ContainerDied","Data":"9675fdc0601c6c560e3010d592a48d6d395e495f27c1ebf599497090ecff60de"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.935733 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9675fdc0601c6c560e3010d592a48d6d395e495f27c1ebf599497090ecff60de" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.937218 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f38-account-create-update-ggt5m" event={"ID":"17f83fe8-1a7b-4411-9dc3-611c0affe393","Type":"ContainerDied","Data":"30a9095e7b1c2dbab1afdc506ac73a5a53fd907c80d15488814caf4dcd0cdf36"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.937238 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30a9095e7b1c2dbab1afdc506ac73a5a53fd907c80d15488814caf4dcd0cdf36" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.937289 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7f38-account-create-update-ggt5m" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.939022 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ghddn" event={"ID":"b501c3bd-07f8-4780-8b55-14db55bc346f","Type":"ContainerDied","Data":"4f031089282551f81052485effaa43b3d47fd23acccad0fcd17fe725be89cad7"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.939053 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f031089282551f81052485effaa43b3d47fd23acccad0fcd17fe725be89cad7" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.939110 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ghddn" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.940398 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-thmbj" podStartSLOduration=2.429706263 podStartE2EDuration="7.940381438s" podCreationTimestamp="2026-02-25 11:11:59 +0000 UTC" firstStartedPulling="2026-02-25 11:12:00.67703376 +0000 UTC m=+1146.175615785" lastFinishedPulling="2026-02-25 11:12:06.187708935 +0000 UTC m=+1151.686290960" observedRunningTime="2026-02-25 11:12:06.934669577 +0000 UTC m=+1152.433251612" watchObservedRunningTime="2026-02-25 11:12:06.940381438 +0000 UTC m=+1152.438963463" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.945793 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e785-account-create-update-hhpzr" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.946276 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e785-account-create-update-hhpzr" event={"ID":"de8d476c-f390-4e03-a518-e0998e0586df","Type":"ContainerDied","Data":"3238f812b21a95b3c4b74a4ee53e7f7707fab6e3be1c1d0c4ca23118c2ef012e"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.946412 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3238f812b21a95b3c4b74a4ee53e7f7707fab6e3be1c1d0c4ca23118c2ef012e" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.952628 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9l6bp" event={"ID":"31ef9958-bb4e-4bf1-a118-d11d04bff97b","Type":"ContainerDied","Data":"13d3a702c48d6dd07b3a4e4bf9cf356554690fd1ac6383902736fbc66a55df47"} Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.952652 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d3a702c48d6dd07b3a4e4bf9cf356554690fd1ac6383902736fbc66a55df47" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.952700 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9l6bp" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.953098 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.956963 4725 generic.go:334] "Generic (PLEG): container finished" podID="0b720fd7-adf3-460d-a61b-832c8c974dc0" containerID="e534b69731edb04a186d0e470c9dd4206bc4dc71418fa371203f3a49f8f4ed68" exitCode=0 Feb 25 11:12:06 crc kubenswrapper[4725]: I0225 11:12:06.957013 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533632-msl4k" event={"ID":"0b720fd7-adf3-460d-a61b-832c8c974dc0","Type":"ContainerDied","Data":"e534b69731edb04a186d0e470c9dd4206bc4dc71418fa371203f3a49f8f4ed68"} Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.043717 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-config\") pod \"9c296aab-4223-43bb-a032-45b20ffeaab5\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.044022 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-nb\") pod \"9c296aab-4223-43bb-a032-45b20ffeaab5\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.044158 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-dns-svc\") pod \"9c296aab-4223-43bb-a032-45b20ffeaab5\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.044268 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-sb\") pod \"9c296aab-4223-43bb-a032-45b20ffeaab5\" (UID: 
\"9c296aab-4223-43bb-a032-45b20ffeaab5\") " Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.044415 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2zlv\" (UniqueName: \"kubernetes.io/projected/9c296aab-4223-43bb-a032-45b20ffeaab5-kube-api-access-v2zlv\") pod \"9c296aab-4223-43bb-a032-45b20ffeaab5\" (UID: \"9c296aab-4223-43bb-a032-45b20ffeaab5\") " Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.048169 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c296aab-4223-43bb-a032-45b20ffeaab5-kube-api-access-v2zlv" (OuterVolumeSpecName: "kube-api-access-v2zlv") pod "9c296aab-4223-43bb-a032-45b20ffeaab5" (UID: "9c296aab-4223-43bb-a032-45b20ffeaab5"). InnerVolumeSpecName "kube-api-access-v2zlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.088768 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-config" (OuterVolumeSpecName: "config") pod "9c296aab-4223-43bb-a032-45b20ffeaab5" (UID: "9c296aab-4223-43bb-a032-45b20ffeaab5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.090317 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c296aab-4223-43bb-a032-45b20ffeaab5" (UID: "9c296aab-4223-43bb-a032-45b20ffeaab5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.095715 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c296aab-4223-43bb-a032-45b20ffeaab5" (UID: "9c296aab-4223-43bb-a032-45b20ffeaab5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.104941 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c296aab-4223-43bb-a032-45b20ffeaab5" (UID: "9c296aab-4223-43bb-a032-45b20ffeaab5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.146558 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.146598 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.146610 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2zlv\" (UniqueName: \"kubernetes.io/projected/9c296aab-4223-43bb-a032-45b20ffeaab5-kube-api-access-v2zlv\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.146619 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-config\") on node \"crc\" DevicePath \"\"" Feb 
25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.146628 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c296aab-4223-43bb-a032-45b20ffeaab5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:07 crc kubenswrapper[4725]: I0225 11:12:07.968032 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7frxh" Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.012119 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7frxh"] Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.029319 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7frxh"] Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.320365 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533632-msl4k" Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.376024 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svb42\" (UniqueName: \"kubernetes.io/projected/0b720fd7-adf3-460d-a61b-832c8c974dc0-kube-api-access-svb42\") pod \"0b720fd7-adf3-460d-a61b-832c8c974dc0\" (UID: \"0b720fd7-adf3-460d-a61b-832c8c974dc0\") " Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.381720 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b720fd7-adf3-460d-a61b-832c8c974dc0-kube-api-access-svb42" (OuterVolumeSpecName: "kube-api-access-svb42") pod "0b720fd7-adf3-460d-a61b-832c8c974dc0" (UID: "0b720fd7-adf3-460d-a61b-832c8c974dc0"). InnerVolumeSpecName "kube-api-access-svb42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.478350 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svb42\" (UniqueName: \"kubernetes.io/projected/0b720fd7-adf3-460d-a61b-832c8c974dc0-kube-api-access-svb42\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.979916 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533632-msl4k" event={"ID":"0b720fd7-adf3-460d-a61b-832c8c974dc0","Type":"ContainerDied","Data":"a34e69f71c1dc53bb9bc9f66ce0e65ec677738afcc0e52a342168050d1d0e2cf"} Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.979979 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a34e69f71c1dc53bb9bc9f66ce0e65ec677738afcc0e52a342168050d1d0e2cf" Feb 25 11:12:08 crc kubenswrapper[4725]: I0225 11:12:08.979984 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533632-msl4k" Feb 25 11:12:09 crc kubenswrapper[4725]: I0225 11:12:09.234473 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c296aab-4223-43bb-a032-45b20ffeaab5" path="/var/lib/kubelet/pods/9c296aab-4223-43bb-a032-45b20ffeaab5/volumes" Feb 25 11:12:09 crc kubenswrapper[4725]: I0225 11:12:09.389808 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533626-fvvbl"] Feb 25 11:12:09 crc kubenswrapper[4725]: I0225 11:12:09.395564 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533626-fvvbl"] Feb 25 11:12:09 crc kubenswrapper[4725]: I0225 11:12:09.988643 4725 generic.go:334] "Generic (PLEG): container finished" podID="6a4bfbae-237f-4d52-9b5d-f47217a2c88c" containerID="71dc519a8394ebf396361c5207f79340bf5433e2dda989ab616b9bbd6e2d41ad" exitCode=0 Feb 25 11:12:09 crc kubenswrapper[4725]: I0225 11:12:09.988686 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thmbj" event={"ID":"6a4bfbae-237f-4d52-9b5d-f47217a2c88c","Type":"ContainerDied","Data":"71dc519a8394ebf396361c5207f79340bf5433e2dda989ab616b9bbd6e2d41ad"} Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.234893 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddaa0363-1011-45ef-9e91-11054f3cb3c1" path="/var/lib/kubelet/pods/ddaa0363-1011-45ef-9e91-11054f3cb3c1/volumes" Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.362924 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-thmbj" Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.425051 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vmjs\" (UniqueName: \"kubernetes.io/projected/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-kube-api-access-8vmjs\") pod \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.425138 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-config-data\") pod \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.425349 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-combined-ca-bundle\") pod \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\" (UID: \"6a4bfbae-237f-4d52-9b5d-f47217a2c88c\") " Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.431305 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-kube-api-access-8vmjs" (OuterVolumeSpecName: 
"kube-api-access-8vmjs") pod "6a4bfbae-237f-4d52-9b5d-f47217a2c88c" (UID: "6a4bfbae-237f-4d52-9b5d-f47217a2c88c"). InnerVolumeSpecName "kube-api-access-8vmjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.448887 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a4bfbae-237f-4d52-9b5d-f47217a2c88c" (UID: "6a4bfbae-237f-4d52-9b5d-f47217a2c88c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.463430 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-config-data" (OuterVolumeSpecName: "config-data") pod "6a4bfbae-237f-4d52-9b5d-f47217a2c88c" (UID: "6a4bfbae-237f-4d52-9b5d-f47217a2c88c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.527903 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.527954 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:11 crc kubenswrapper[4725]: I0225 11:12:11.527973 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vmjs\" (UniqueName: \"kubernetes.io/projected/6a4bfbae-237f-4d52-9b5d-f47217a2c88c-kube-api-access-8vmjs\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.009330 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thmbj" event={"ID":"6a4bfbae-237f-4d52-9b5d-f47217a2c88c","Type":"ContainerDied","Data":"99b3a4f0e6e44b8c8ac81161e67da1a1defafa4fed98274ef4d45d92be101489"} Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.009388 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99b3a4f0e6e44b8c8ac81161e67da1a1defafa4fed98274ef4d45d92be101489" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.009414 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-thmbj" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.271623 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-d568j"] Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.272923 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8d476c-f390-4e03-a518-e0998e0586df" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.273215 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8d476c-f390-4e03-a518-e0998e0586df" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.273275 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerName="init" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.273368 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerName="init" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.273439 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ef9958-bb4e-4bf1-a118-d11d04bff97b" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.273491 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ef9958-bb4e-4bf1-a118-d11d04bff97b" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.273551 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43106b29-d57b-47d4-90dd-9ea16422dc05" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.273617 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="43106b29-d57b-47d4-90dd-9ea16422dc05" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.273681 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a4bfbae-237f-4d52-9b5d-f47217a2c88c" containerName="keystone-db-sync" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.273738 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4bfbae-237f-4d52-9b5d-f47217a2c88c" containerName="keystone-db-sync" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.273797 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262e67d4-08ee-405a-aa45-14c222a8e9f1" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.273866 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="262e67d4-08ee-405a-aa45-14c222a8e9f1" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.273952 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerName="dnsmasq-dns" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274015 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerName="dnsmasq-dns" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.274079 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f83fe8-1a7b-4411-9dc3-611c0affe393" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274131 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f83fe8-1a7b-4411-9dc3-611c0affe393" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.274193 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b720fd7-adf3-460d-a61b-832c8c974dc0" containerName="oc" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274246 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b720fd7-adf3-460d-a61b-832c8c974dc0" containerName="oc" Feb 25 11:12:12 crc kubenswrapper[4725]: E0225 11:12:12.274312 4725 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b501c3bd-07f8-4780-8b55-14db55bc346f" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274365 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b501c3bd-07f8-4780-8b55-14db55bc346f" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274547 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ef9958-bb4e-4bf1-a118-d11d04bff97b" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274607 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="43106b29-d57b-47d4-90dd-9ea16422dc05" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274661 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="262e67d4-08ee-405a-aa45-14c222a8e9f1" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274717 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4bfbae-237f-4d52-9b5d-f47217a2c88c" containerName="keystone-db-sync" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274787 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b720fd7-adf3-460d-a61b-832c8c974dc0" containerName="oc" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274863 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c296aab-4223-43bb-a032-45b20ffeaab5" containerName="dnsmasq-dns" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274932 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8d476c-f390-4e03-a518-e0998e0586df" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.274992 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f83fe8-1a7b-4411-9dc3-611c0affe393" containerName="mariadb-account-create-update" Feb 25 11:12:12 crc 
kubenswrapper[4725]: I0225 11:12:12.275063 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b501c3bd-07f8-4780-8b55-14db55bc346f" containerName="mariadb-database-create" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.275900 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-d568j" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.287720 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-d568j"] Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.308914 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8qqnx"] Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.310063 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8qqnx" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.314439 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bt58t" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.314703 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.314766 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.314896 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.314817 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.340739 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-sb\") pod 
\"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.340942 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnc2\" (UniqueName: \"kubernetes.io/projected/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-kube-api-access-7gnc2\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.341102 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-svc\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.341198 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.341345 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.341455 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-config\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.350193 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8qqnx"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447077 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8jjd\" (UniqueName: \"kubernetes.io/projected/1319aae4-df52-49f2-8baf-3380d31994db-kube-api-access-f8jjd\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447137 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-config-data\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447191 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-svc\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447219 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-combined-ca-bundle\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447247 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447299 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447326 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-fernet-keys\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447385 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-credential-keys\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447411 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-config\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447452 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnc2\" (UniqueName: \"kubernetes.io/projected/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-kube-api-access-7gnc2\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447476 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.447534 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-scripts\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.448592 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-svc\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.449882 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.450639 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-config\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.451140 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.453253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.481015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnc2\" (UniqueName: \"kubernetes.io/projected/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-kube-api-access-7gnc2\") pod \"dnsmasq-dns-847c4cc679-d568j\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.491259 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d9dc4c7c7-dqrgs"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.492789 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.500386 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.500655 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.500803 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.500969 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jvrgc"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.518534 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d9dc4c7c7-dqrgs"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549741 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-combined-ca-bundle\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549794 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-config-data\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549835 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-scripts\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549853 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c3886b-35cd-47aa-aa75-6a23a593eba9-logs\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549872 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-fernet-keys\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549906 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18c3886b-35cd-47aa-aa75-6a23a593eba9-horizon-secret-key\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549921 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p92b5\" (UniqueName: \"kubernetes.io/projected/18c3886b-35cd-47aa-aa75-6a23a593eba9-kube-api-access-p92b5\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549944 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-credential-keys\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.549993 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-scripts\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.550020 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-config-data\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.550037 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8jjd\" (UniqueName: \"kubernetes.io/projected/1319aae4-df52-49f2-8baf-3380d31994db-kube-api-access-f8jjd\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.557243 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-combined-ca-bundle\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.559337 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-fernet-keys\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.560730 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-config-data\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.561216 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-credential-keys\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.573013 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-scripts\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.594803 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7mfzn"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.596137 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.599346 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.599588 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-n7c24"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.600599 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.611387 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-d568j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.618054 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mfzn"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.632704 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8jjd\" (UniqueName: \"kubernetes.io/projected/1319aae4-df52-49f2-8baf-3380d31994db-kube-api-access-f8jjd\") pod \"keystone-bootstrap-8qqnx\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") " pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.646167 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653177 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-combined-ca-bundle\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653266 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-config-data\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653293 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-config\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653313 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-scripts\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653326 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c3886b-35cd-47aa-aa75-6a23a593eba9-logs\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653360 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18c3886b-35cd-47aa-aa75-6a23a593eba9-horizon-secret-key\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653376 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p92b5\" (UniqueName: \"kubernetes.io/projected/18c3886b-35cd-47aa-aa75-6a23a593eba9-kube-api-access-p92b5\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.653395 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98k2l\" (UniqueName: \"kubernetes.io/projected/23a6a21f-d099-43a7-96f6-51c056d4568c-kube-api-access-98k2l\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.654660 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-config-data\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.655167 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-scripts\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.657559 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18c3886b-35cd-47aa-aa75-6a23a593eba9-horizon-secret-key\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.657799 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c3886b-35cd-47aa-aa75-6a23a593eba9-logs\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.685260 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p92b5\" (UniqueName: \"kubernetes.io/projected/18c3886b-35cd-47aa-aa75-6a23a593eba9-kube-api-access-p92b5\") pod \"horizon-d9dc4c7c7-dqrgs\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.695657 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7mk8j"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.696670 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.703913 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.703937 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.704086 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tf7vm"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.724885 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7mk8j"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.740968 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.742260 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.752467 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.752664 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sz5r7"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.752781 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.752907 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756438 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-scripts\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756485 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-combined-ca-bundle\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756538 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-config\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756586 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ppsg\" (UniqueName: \"kubernetes.io/projected/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-kube-api-access-2ppsg\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756609 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-config-data\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756631 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98k2l\" (UniqueName: \"kubernetes.io/projected/23a6a21f-d099-43a7-96f6-51c056d4568c-kube-api-access-98k2l\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756650 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-db-sync-config-data\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756703 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-etc-machine-id\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.756731 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-combined-ca-bundle\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.761433 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-config\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.762897 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.763549 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-combined-ca-bundle\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.803663 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79c5587bf7-bzj68"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.805210 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79c5587bf7-bzj68"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.826149 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-d568j"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.840163 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.842017 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.842326 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.853523 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.854049 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.859764 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-skknf"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.860750 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-skknf"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861291 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98k2l\" (UniqueName: \"kubernetes.io/projected/23a6a21f-d099-43a7-96f6-51c056d4568c-kube-api-access-98k2l\") pod \"neutron-db-sync-7mfzn\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") " pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861654 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-etc-machine-id\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861703 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861721 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-logs\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861738 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861755 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-scripts\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861773 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4j5p\" (UniqueName: \"kubernetes.io/projected/b42e0576-0579-42d0-b704-6016cf57ca7a-kube-api-access-n4j5p\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861790 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-scripts\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861807 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-logs\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861839 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-combined-ca-bundle\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861870 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861894 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861912 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-horizon-secret-key\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861951 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxr8j\" (UniqueName: \"kubernetes.io/projected/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-kube-api-access-jxr8j\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861969 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ppsg\" (UniqueName: \"kubernetes.io/projected/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-kube-api-access-2ppsg\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.861990 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-config-data\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.862009 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-config-data\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.862025 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.862042 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-db-sync-config-data\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.862070 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.862136 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-etc-machine-id\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.864661 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79c5587bf7-bzj68"]
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.866922 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tq92j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.867332 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.868279 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-combined-ca-bundle\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.875572 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-scripts\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.880289 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-db-sync-config-data\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.881629 4725
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-config-data\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.891530 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-djg6t"] Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.892578 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.894350 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.894772 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.895708 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-km7bc" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.899951 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ppsg\" (UniqueName: \"kubernetes.io/projected/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-kube-api-access-2ppsg\") pod \"cinder-db-sync-7mk8j\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") " pod="openstack/cinder-db-sync-7mk8j" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.920922 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.980764 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " 
pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.981164 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-logs\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.981731 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-logs\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982604 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4j5p\" (UniqueName: \"kubernetes.io/projected/b42e0576-0579-42d0-b704-6016cf57ca7a-kube-api-access-n4j5p\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982675 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-scripts\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982701 
4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-logs\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982734 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-db-sync-config-data\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982785 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982837 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982859 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-horizon-secret-key\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982913 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-combined-ca-bundle\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982937 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxr8j\" (UniqueName: \"kubernetes.io/projected/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-kube-api-access-jxr8j\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.982980 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-config-data\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.983001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.983053 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.983076 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvrb\" (UniqueName: 
\"kubernetes.io/projected/cf601308-e467-48ee-998c-7a2ecf04d92c-kube-api-access-dzvrb\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.983485 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.984477 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-scripts\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.989876 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:12 crc kubenswrapper[4725]: I0225 11:12:12.990381 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-logs\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.003205 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") device mount 
path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.005267 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-config-data\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.013085 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4j5p\" (UniqueName: \"kubernetes.io/projected/b42e0576-0579-42d0-b704-6016cf57ca7a-kube-api-access-n4j5p\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.018748 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.018804 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.019407 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-skknf"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.026502 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-horizon-secret-key\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.026962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.035510 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxr8j\" (UniqueName: \"kubernetes.io/projected/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-kube-api-access-jxr8j\") pod \"horizon-79c5587bf7-bzj68\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.048446 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-djg6t"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.054075 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.057158 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mfzn" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.073420 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-djnkv"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.075352 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.084978 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2tps\" (UniqueName: \"kubernetes.io/projected/90402c1e-560a-4551-a218-91d0e04760a4-kube-api-access-l2tps\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.085018 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.085035 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.086200 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d76b9\" (UniqueName: \"kubernetes.io/projected/76768b73-31d1-407a-90e7-9583d2b3a773-kube-api-access-d76b9\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.086239 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzvrb\" (UniqueName: \"kubernetes.io/projected/cf601308-e467-48ee-998c-7a2ecf04d92c-kube-api-access-dzvrb\") pod 
\"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.087294 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.087373 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-log-httpd\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088110 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-config-data\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088151 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-scripts\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088207 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " 
pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088249 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-db-sync-config-data\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088294 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-scripts\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088418 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76768b73-31d1-407a-90e7-9583d2b3a773-logs\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088583 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mccc5\" (UniqueName: \"kubernetes.io/projected/7492d83b-6fd0-420c-99a5-19caedc41981-kube-api-access-mccc5\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088618 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-config\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088661 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-combined-ca-bundle\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088680 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088732 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-run-httpd\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088754 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-combined-ca-bundle\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088776 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.088802 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-config-data\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.095495 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7mk8j" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.099600 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-combined-ca-bundle\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.100367 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-db-sync-config-data\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.111269 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.113015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzvrb\" (UniqueName: \"kubernetes.io/projected/cf601308-e467-48ee-998c-7a2ecf04d92c-kube-api-access-dzvrb\") pod \"barbican-db-sync-skknf\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.133860 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.139242 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.141910 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.142055 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.159809 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.164971 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-djnkv"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.195196 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.196484 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-config-data\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.196510 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-scripts\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.196538 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.196567 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-scripts\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200557 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76768b73-31d1-407a-90e7-9583d2b3a773-logs\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200595 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mccc5\" (UniqueName: \"kubernetes.io/projected/7492d83b-6fd0-420c-99a5-19caedc41981-kube-api-access-mccc5\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200632 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-config\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200669 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200714 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-run-httpd\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200732 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-combined-ca-bundle\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200758 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" 
(UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200778 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-config-data\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200803 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2tps\" (UniqueName: \"kubernetes.io/projected/90402c1e-560a-4551-a218-91d0e04760a4-kube-api-access-l2tps\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200840 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200857 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200878 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d76b9\" (UniqueName: \"kubernetes.io/projected/76768b73-31d1-407a-90e7-9583d2b3a773-kube-api-access-d76b9\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " 
pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200902 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.200943 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-log-httpd\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.201380 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-log-httpd\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.207613 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-run-httpd\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.214532 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.223340 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-config-data\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.223475 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-scripts\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.223662 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.224142 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.224160 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.224503 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-config\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.225308 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.233359 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76768b73-31d1-407a-90e7-9583d2b3a773-logs\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.233601 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-scripts\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.234376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-config-data\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.256942 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-combined-ca-bundle\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.284595 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2tps\" (UniqueName: \"kubernetes.io/projected/90402c1e-560a-4551-a218-91d0e04760a4-kube-api-access-l2tps\") pod \"dnsmasq-dns-785d8bcb8c-djnkv\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.286538 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.289247 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d76b9\" (UniqueName: \"kubernetes.io/projected/76768b73-31d1-407a-90e7-9583d2b3a773-kube-api-access-d76b9\") pod \"placement-db-sync-djg6t\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.291799 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.302728 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.302784 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4b9\" (UniqueName: \"kubernetes.io/projected/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-kube-api-access-jb4b9\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 
11:12:13.302963 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.303077 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.303119 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.303257 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.303311 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc 
kubenswrapper[4725]: I0225 11:12:13.303441 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.308026 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mccc5\" (UniqueName: \"kubernetes.io/projected/7492d83b-6fd0-420c-99a5-19caedc41981-kube-api-access-mccc5\") pod \"ceilometer-0\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.401148 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-d568j"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.401742 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404414 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404459 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404534 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404586 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404619 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4b9\" (UniqueName: \"kubernetes.io/projected/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-kube-api-access-jb4b9\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 
11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404680 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.404782 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.405956 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.410415 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.419804 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.420383 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.429327 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8qqnx"] Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.435073 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.452652 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.453335 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc 
kubenswrapper[4725]: I0225 11:12:13.458900 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4b9\" (UniqueName: \"kubernetes.io/projected/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-kube-api-access-jb4b9\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.491258 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.517230 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.553570 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.756005 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.798071 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d9dc4c7c7-dqrgs"] Feb 25 11:12:13 crc kubenswrapper[4725]: W0225 11:12:13.806983 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18c3886b_35cd_47aa_aa75_6a23a593eba9.slice/crio-f962810ff62c31b5bda0c48118393ebfbb7a952b8b6010eb9e8524729d939748 WatchSource:0}: Error finding container f962810ff62c31b5bda0c48118393ebfbb7a952b8b6010eb9e8524729d939748: Status 404 returned error can't find the container with id f962810ff62c31b5bda0c48118393ebfbb7a952b8b6010eb9e8524729d939748 Feb 25 11:12:13 crc kubenswrapper[4725]: I0225 11:12:13.809789 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:13.920813 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mfzn"] Feb 25 11:12:14 crc kubenswrapper[4725]: W0225 11:12:13.975019 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a6a21f_d099_43a7_96f6_51c056d4568c.slice/crio-74073ca1f5cecb21d68f29826514bf719acded05ab9db4136e8260672f185b24 WatchSource:0}: Error finding container 74073ca1f5cecb21d68f29826514bf719acded05ab9db4136e8260672f185b24: Status 404 returned error can't find the container with id 74073ca1f5cecb21d68f29826514bf719acded05ab9db4136e8260672f185b24 Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.039798 4725 generic.go:334] "Generic (PLEG): container finished" podID="962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" containerID="749da19ba2420ce2871fa1e06ba59197bede5ae0ce85d2fdcd669e569c1037cf" exitCode=0 Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.039850 4725 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-847c4cc679-d568j" event={"ID":"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d","Type":"ContainerDied","Data":"749da19ba2420ce2871fa1e06ba59197bede5ae0ce85d2fdcd669e569c1037cf"} Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.040077 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-d568j" event={"ID":"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d","Type":"ContainerStarted","Data":"842414bf8fa1c298f1af9601f3a340714c6cb02e04f7d21a22ef0f48b48eaacc"} Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.041882 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8qqnx" event={"ID":"1319aae4-df52-49f2-8baf-3380d31994db","Type":"ContainerStarted","Data":"3d8cbaae69be3731abeeaddc0ffa201ef89d35ac7a5041c913b8bf86d8bc2854"} Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.041934 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8qqnx" event={"ID":"1319aae4-df52-49f2-8baf-3380d31994db","Type":"ContainerStarted","Data":"97a9e162d99c964926367712d12e5b08a830010633bd51804ee634f4f085bba9"} Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.043045 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mfzn" event={"ID":"23a6a21f-d099-43a7-96f6-51c056d4568c","Type":"ContainerStarted","Data":"74073ca1f5cecb21d68f29826514bf719acded05ab9db4136e8260672f185b24"} Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.044631 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d9dc4c7c7-dqrgs" event={"ID":"18c3886b-35cd-47aa-aa75-6a23a593eba9","Type":"ContainerStarted","Data":"f962810ff62c31b5bda0c48118393ebfbb7a952b8b6010eb9e8524729d939748"} Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.079610 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8qqnx" podStartSLOduration=2.079592837 
podStartE2EDuration="2.079592837s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:14.07780434 +0000 UTC m=+1159.576386385" watchObservedRunningTime="2026-02-25 11:12:14.079592837 +0000 UTC m=+1159.578174862" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.490442 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.568382 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79c5587bf7-bzj68"] Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.586346 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.611747 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6cd57d84df-rmns4"] Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.613068 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.629420 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cd57d84df-rmns4"] Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.743185 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-scripts\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.743240 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snxfx\" (UniqueName: \"kubernetes.io/projected/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-kube-api-access-snxfx\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.743283 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-config-data\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.743356 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-horizon-secret-key\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.743407 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-logs\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.787565 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.806264 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79c5587bf7-bzj68"] Feb 25 11:12:14 crc kubenswrapper[4725]: W0225 11:12:14.820343 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aebfbbc_99ac_4f7f_b7a6_e02102f97c06.slice/crio-79ced7d3b278c31cef2b3e07e40eb612b47487b57d70b1b326aa8d1aba57b41b WatchSource:0}: Error finding container 79ced7d3b278c31cef2b3e07e40eb612b47487b57d70b1b326aa8d1aba57b41b: Status 404 returned error can't find the container with id 79ced7d3b278c31cef2b3e07e40eb612b47487b57d70b1b326aa8d1aba57b41b Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.845055 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-scripts\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.845109 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snxfx\" (UniqueName: \"kubernetes.io/projected/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-kube-api-access-snxfx\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.845137 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-config-data\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.845198 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-horizon-secret-key\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.845839 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-logs\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.845992 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-scripts\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.846487 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-config-data\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.846871 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-logs\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " 
pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.856379 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-horizon-secret-key\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.861509 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snxfx\" (UniqueName: \"kubernetes.io/projected/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-kube-api-access-snxfx\") pod \"horizon-6cd57d84df-rmns4\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.908250 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.947668 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:14 crc kubenswrapper[4725]: I0225 11:12:14.993492 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-skknf"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.002172 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7mk8j"] Feb 25 11:12:15 crc kubenswrapper[4725]: W0225 11:12:15.022803 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe5daf6_23bb_4480_8bd7_724dbb47ad3d.slice/crio-9c6014acdab9674f9313d281a65976fa14b19991eedc63917ece3d5dab9d691a WatchSource:0}: Error finding container 9c6014acdab9674f9313d281a65976fa14b19991eedc63917ece3d5dab9d691a: Status 404 returned error can't find the container with id 9c6014acdab9674f9313d281a65976fa14b19991eedc63917ece3d5dab9d691a Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.040038 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-d568j" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.059600 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mfzn" event={"ID":"23a6a21f-d099-43a7-96f6-51c056d4568c","Type":"ContainerStarted","Data":"fb2093968cb49803dda5fbd5f5111472618358218c96b8de28be014644661098"} Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.074871 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b42e0576-0579-42d0-b704-6016cf57ca7a","Type":"ContainerStarted","Data":"673db2f61e4d0807bba3e0380738e49b45c12fa0634a2df70db55e3fa7cd6b65"} Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.083051 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7mfzn" podStartSLOduration=3.083030702 podStartE2EDuration="3.083030702s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:15.081525222 +0000 UTC m=+1160.580107247" watchObservedRunningTime="2026-02-25 11:12:15.083030702 +0000 UTC m=+1160.581612727" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.086007 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7mk8j" event={"ID":"afe5daf6-23bb-4480-8bd7-724dbb47ad3d","Type":"ContainerStarted","Data":"9c6014acdab9674f9313d281a65976fa14b19991eedc63917ece3d5dab9d691a"} Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.099254 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c5587bf7-bzj68" event={"ID":"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06","Type":"ContainerStarted","Data":"79ced7d3b278c31cef2b3e07e40eb612b47487b57d70b1b326aa8d1aba57b41b"} Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.101364 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-847c4cc679-d568j" event={"ID":"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d","Type":"ContainerDied","Data":"842414bf8fa1c298f1af9601f3a340714c6cb02e04f7d21a22ef0f48b48eaacc"} Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.101415 4725 scope.go:117] "RemoveContainer" containerID="749da19ba2420ce2871fa1e06ba59197bede5ae0ce85d2fdcd669e569c1037cf" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.101441 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-d568j" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.112993 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-skknf" event={"ID":"cf601308-e467-48ee-998c-7a2ecf04d92c","Type":"ContainerStarted","Data":"a2985c4f57060bdd63a1a81a2c8cdab408999172451a92ab8b1ebad27150b933"} Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.150098 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-config\") pod \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.150139 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-swift-storage-0\") pod \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.150233 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-sb\") pod \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.150778 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-svc\") pod \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.150818 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-nb\") pod \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.150848 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-djnkv"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.150875 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gnc2\" (UniqueName: \"kubernetes.io/projected/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-kube-api-access-7gnc2\") pod \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\" (UID: \"962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d\") " Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.164646 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.181219 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-kube-api-access-7gnc2" (OuterVolumeSpecName: "kube-api-access-7gnc2") pod "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" (UID: "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d"). InnerVolumeSpecName "kube-api-access-7gnc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.189051 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" (UID: "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.190089 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-djg6t"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.211876 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" (UID: "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.223446 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" (UID: "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.228273 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-config" (OuterVolumeSpecName: "config") pod "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" (UID: "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.246207 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" (UID: "962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.256867 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.256900 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.257047 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gnc2\" (UniqueName: \"kubernetes.io/projected/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-kube-api-access-7gnc2\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.257057 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.257066 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.257075 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.288329 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.564590 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-d568j"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.577808 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-d568j"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.586513 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cd57d84df-rmns4"] Feb 25 11:12:15 crc kubenswrapper[4725]: I0225 11:12:15.889591 4725 scope.go:117] "RemoveContainer" containerID="1b53b06edbd047e4e3ab11814740bc96fd5e59649ae99a4c964eaa08647244f8" Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.122909 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b42e0576-0579-42d0-b704-6016cf57ca7a","Type":"ContainerStarted","Data":"0c42f536f26b9542ca4630ede5b63c60a5ecf386875a3bfdebc2a960422da0fc"} Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.127580 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb4f2286-0a97-42ce-b7f2-39107be8d6bc","Type":"ContainerStarted","Data":"1435fdf8e73e6797300d7a7a6303fbbd8d1ca7f9b335c1f8b04a1c0c7d477c70"} Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.135838 4725 generic.go:334] "Generic (PLEG): container finished" podID="90402c1e-560a-4551-a218-91d0e04760a4" containerID="59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db" exitCode=0 Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.136097 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" 
event={"ID":"90402c1e-560a-4551-a218-91d0e04760a4","Type":"ContainerDied","Data":"59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db"} Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.136198 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" event={"ID":"90402c1e-560a-4551-a218-91d0e04760a4","Type":"ContainerStarted","Data":"26051ee387f3121d4741a2e095b3a58f55095f991b106b12b6ad26a90dde0ce0"} Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.155685 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerStarted","Data":"c21c2e771659702c09d639abb64e9910250290abffeef19b19a4278339d519d0"} Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.165394 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-djg6t" event={"ID":"76768b73-31d1-407a-90e7-9583d2b3a773","Type":"ContainerStarted","Data":"4bc48bbcc698e179c27e67436bb42169df64489e353bd9aa7ebb2337d56ae97c"} Feb 25 11:12:16 crc kubenswrapper[4725]: I0225 11:12:16.170974 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd57d84df-rmns4" event={"ID":"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1","Type":"ContainerStarted","Data":"ad4dda2eb6bc1321cd1e69038eae11c321102e1a70a5990832c5e19431fe4faf"} Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.185686 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" event={"ID":"90402c1e-560a-4551-a218-91d0e04760a4","Type":"ContainerStarted","Data":"947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959"} Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.186243 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.190188 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"b42e0576-0579-42d0-b704-6016cf57ca7a","Type":"ContainerStarted","Data":"5255a3a6dfa7feb051a8afdbbe5ecfb84f0f93498b7075671c911d55e22a1684"} Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.190370 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-log" containerID="cri-o://0c42f536f26b9542ca4630ede5b63c60a5ecf386875a3bfdebc2a960422da0fc" gracePeriod=30 Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.190414 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-httpd" containerID="cri-o://5255a3a6dfa7feb051a8afdbbe5ecfb84f0f93498b7075671c911d55e22a1684" gracePeriod=30 Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.195261 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb4f2286-0a97-42ce-b7f2-39107be8d6bc","Type":"ContainerStarted","Data":"c1e2b612ea1bf2ce371b8fd8ec86f336fdff9025bb674d976c721485aa505dbb"} Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.212548 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" podStartSLOduration=5.212531875 podStartE2EDuration="5.212531875s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:17.205666754 +0000 UTC m=+1162.704248789" watchObservedRunningTime="2026-02-25 11:12:17.212531875 +0000 UTC m=+1162.711113900" Feb 25 11:12:17 crc kubenswrapper[4725]: I0225 11:12:17.238063 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" 
path="/var/lib/kubelet/pods/962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d/volumes" Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.223353 4725 generic.go:334] "Generic (PLEG): container finished" podID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerID="5255a3a6dfa7feb051a8afdbbe5ecfb84f0f93498b7075671c911d55e22a1684" exitCode=0 Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.224079 4725 generic.go:334] "Generic (PLEG): container finished" podID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerID="0c42f536f26b9542ca4630ede5b63c60a5ecf386875a3bfdebc2a960422da0fc" exitCode=143 Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.223518 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b42e0576-0579-42d0-b704-6016cf57ca7a","Type":"ContainerDied","Data":"5255a3a6dfa7feb051a8afdbbe5ecfb84f0f93498b7075671c911d55e22a1684"} Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.224148 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b42e0576-0579-42d0-b704-6016cf57ca7a","Type":"ContainerDied","Data":"0c42f536f26b9542ca4630ede5b63c60a5ecf386875a3bfdebc2a960422da0fc"} Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.228211 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb4f2286-0a97-42ce-b7f2-39107be8d6bc","Type":"ContainerStarted","Data":"f1d7f096c6457b9c43337ba45d7ab3a88bf8346cec3e8b1587f647a6b5aff2af"} Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.228265 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-log" containerID="cri-o://c1e2b612ea1bf2ce371b8fd8ec86f336fdff9025bb674d976c721485aa505dbb" gracePeriod=30 Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.228353 4725 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-httpd" containerID="cri-o://f1d7f096c6457b9c43337ba45d7ab3a88bf8346cec3e8b1587f647a6b5aff2af" gracePeriod=30 Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.259766 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.259745237 podStartE2EDuration="6.259745237s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:18.246159128 +0000 UTC m=+1163.744741153" watchObservedRunningTime="2026-02-25 11:12:18.259745237 +0000 UTC m=+1163.758327272" Feb 25 11:12:18 crc kubenswrapper[4725]: I0225 11:12:18.269316 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.269298009 podStartE2EDuration="6.269298009s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:17.228091466 +0000 UTC m=+1162.726673481" watchObservedRunningTime="2026-02-25 11:12:18.269298009 +0000 UTC m=+1163.767880034" Feb 25 11:12:19 crc kubenswrapper[4725]: I0225 11:12:19.238417 4725 generic.go:334] "Generic (PLEG): container finished" podID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerID="f1d7f096c6457b9c43337ba45d7ab3a88bf8346cec3e8b1587f647a6b5aff2af" exitCode=0 Feb 25 11:12:19 crc kubenswrapper[4725]: I0225 11:12:19.238444 4725 generic.go:334] "Generic (PLEG): container finished" podID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerID="c1e2b612ea1bf2ce371b8fd8ec86f336fdff9025bb674d976c721485aa505dbb" exitCode=143 Feb 25 11:12:19 crc kubenswrapper[4725]: I0225 11:12:19.238473 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb4f2286-0a97-42ce-b7f2-39107be8d6bc","Type":"ContainerDied","Data":"f1d7f096c6457b9c43337ba45d7ab3a88bf8346cec3e8b1587f647a6b5aff2af"} Feb 25 11:12:19 crc kubenswrapper[4725]: I0225 11:12:19.238496 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb4f2286-0a97-42ce-b7f2-39107be8d6bc","Type":"ContainerDied","Data":"c1e2b612ea1bf2ce371b8fd8ec86f336fdff9025bb674d976c721485aa505dbb"} Feb 25 11:12:19 crc kubenswrapper[4725]: I0225 11:12:19.241330 4725 generic.go:334] "Generic (PLEG): container finished" podID="1319aae4-df52-49f2-8baf-3380d31994db" containerID="3d8cbaae69be3731abeeaddc0ffa201ef89d35ac7a5041c913b8bf86d8bc2854" exitCode=0 Feb 25 11:12:19 crc kubenswrapper[4725]: I0225 11:12:19.241353 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8qqnx" event={"ID":"1319aae4-df52-49f2-8baf-3380d31994db","Type":"ContainerDied","Data":"3d8cbaae69be3731abeeaddc0ffa201ef89d35ac7a5041c913b8bf86d8bc2854"} Feb 25 11:12:20 crc kubenswrapper[4725]: I0225 11:12:20.968732 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d9dc4c7c7-dqrgs"] Feb 25 11:12:20 crc kubenswrapper[4725]: I0225 11:12:20.981358 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64cd88bfbd-zxddf"] Feb 25 11:12:20 crc kubenswrapper[4725]: E0225 11:12:20.981707 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" containerName="init" Feb 25 11:12:20 crc kubenswrapper[4725]: I0225 11:12:20.981723 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" containerName="init" Feb 25 11:12:20 crc kubenswrapper[4725]: I0225 11:12:20.982124 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="962dfbb0-8d31-4abc-95b7-dd6c45ca3f5d" containerName="init" Feb 25 
11:12:20 crc kubenswrapper[4725]: I0225 11:12:20.982919 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:20 crc kubenswrapper[4725]: I0225 11:12:20.997541 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 25 11:12:20 crc kubenswrapper[4725]: I0225 11:12:20.999105 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64cd88bfbd-zxddf"] Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.042084 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cd57d84df-rmns4"] Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.058962 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-config-data\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.059037 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fhtc\" (UniqueName: \"kubernetes.io/projected/abad9fb0-482e-4ed1-8bf5-e738ee946358-kube-api-access-9fhtc\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.059068 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-secret-key\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.059090 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-scripts\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.059105 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-tls-certs\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.059120 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abad9fb0-482e-4ed1-8bf5-e738ee946358-logs\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.059394 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-combined-ca-bundle\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.077644 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cbf649584-gsrdx"] Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.079107 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cbf649584-gsrdx" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.138700 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cbf649584-gsrdx"] Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.160755 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-secret-key\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.160898 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdbp\" (UniqueName: \"kubernetes.io/projected/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-kube-api-access-rcdbp\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.160944 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-scripts\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.160976 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-scripts\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161000 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-tls-certs\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161024 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abad9fb0-482e-4ed1-8bf5-e738ee946358-logs\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161065 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-config-data\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161086 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-logs\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161156 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-combined-ca-bundle\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161204 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-combined-ca-bundle\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161246 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-config-data\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-horizon-secret-key\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161323 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-horizon-tls-certs\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161361 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fhtc\" (UniqueName: \"kubernetes.io/projected/abad9fb0-482e-4ed1-8bf5-e738ee946358-kube-api-access-9fhtc\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.161817 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-scripts\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.163770 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-config-data\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.164086 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abad9fb0-482e-4ed1-8bf5-e738ee946358-logs\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.166617 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-combined-ca-bundle\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.166902 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-tls-certs\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.188797 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fhtc\" (UniqueName: \"kubernetes.io/projected/abad9fb0-482e-4ed1-8bf5-e738ee946358-kube-api-access-9fhtc\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.194032 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-secret-key\") pod \"horizon-64cd88bfbd-zxddf\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.262554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-combined-ca-bundle\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.262610 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-horizon-secret-key\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.262679 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-horizon-tls-certs\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.262731 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdbp\" (UniqueName: \"kubernetes.io/projected/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-kube-api-access-rcdbp\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.262752 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-scripts\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.262795 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-config-data\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.262819 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-logs\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.263550 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-logs\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.264725 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-scripts\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.264967 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-config-data\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.266439 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-combined-ca-bundle\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.266916 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-horizon-tls-certs\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.267822 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-horizon-secret-key\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.285334 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdbp\" (UniqueName: \"kubernetes.io/projected/f017ec2d-5d1b-405c-b2f7-b3212e3696d7-kube-api-access-rcdbp\") pod \"horizon-7cbf649584-gsrdx\" (UID: \"f017ec2d-5d1b-405c-b2f7-b3212e3696d7\") " pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.319313 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64cd88bfbd-zxddf"
Feb 25 11:12:21 crc kubenswrapper[4725]: I0225 11:12:21.404846 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cbf649584-gsrdx"
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.847907 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.893290 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-scripts\") pod \"1319aae4-df52-49f2-8baf-3380d31994db\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") "
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.893708 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-combined-ca-bundle\") pod \"1319aae4-df52-49f2-8baf-3380d31994db\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") "
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.893745 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-config-data\") pod \"1319aae4-df52-49f2-8baf-3380d31994db\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") "
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.893802 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8jjd\" (UniqueName: \"kubernetes.io/projected/1319aae4-df52-49f2-8baf-3380d31994db-kube-api-access-f8jjd\") pod \"1319aae4-df52-49f2-8baf-3380d31994db\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") "
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.894120 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-credential-keys\") pod \"1319aae4-df52-49f2-8baf-3380d31994db\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") "
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.894169 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-fernet-keys\") pod \"1319aae4-df52-49f2-8baf-3380d31994db\" (UID: \"1319aae4-df52-49f2-8baf-3380d31994db\") "
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.903034 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1319aae4-df52-49f2-8baf-3380d31994db" (UID: "1319aae4-df52-49f2-8baf-3380d31994db"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.903071 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-scripts" (OuterVolumeSpecName: "scripts") pod "1319aae4-df52-49f2-8baf-3380d31994db" (UID: "1319aae4-df52-49f2-8baf-3380d31994db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.903122 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1319aae4-df52-49f2-8baf-3380d31994db" (UID: "1319aae4-df52-49f2-8baf-3380d31994db"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.903598 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1319aae4-df52-49f2-8baf-3380d31994db-kube-api-access-f8jjd" (OuterVolumeSpecName: "kube-api-access-f8jjd") pod "1319aae4-df52-49f2-8baf-3380d31994db" (UID: "1319aae4-df52-49f2-8baf-3380d31994db"). InnerVolumeSpecName "kube-api-access-f8jjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.927079 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1319aae4-df52-49f2-8baf-3380d31994db" (UID: "1319aae4-df52-49f2-8baf-3380d31994db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.944173 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-config-data" (OuterVolumeSpecName: "config-data") pod "1319aae4-df52-49f2-8baf-3380d31994db" (UID: "1319aae4-df52-49f2-8baf-3380d31994db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.996715 4725 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.996754 4725 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.996765 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.996778 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.996790 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1319aae4-df52-49f2-8baf-3380d31994db-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:22 crc kubenswrapper[4725]: I0225 11:12:22.996803 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8jjd\" (UniqueName: \"kubernetes.io/projected/1319aae4-df52-49f2-8baf-3380d31994db-kube-api-access-f8jjd\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.285446 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8qqnx" event={"ID":"1319aae4-df52-49f2-8baf-3380d31994db","Type":"ContainerDied","Data":"97a9e162d99c964926367712d12e5b08a830010633bd51804ee634f4f085bba9"}
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.285511 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a9e162d99c964926367712d12e5b08a830010633bd51804ee634f4f085bba9"
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.285544 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8qqnx"
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.405139 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv"
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.497021 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mrq7b"]
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.497416 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="dnsmasq-dns" containerID="cri-o://742c379fd1cce6dd119e7b9543d95d51a4735d72b9453b172fd15b8c71a2bb4f" gracePeriod=10
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.941880 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8qqnx"]
Feb 25 11:12:23 crc kubenswrapper[4725]: I0225 11:12:23.952394 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8qqnx"]
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.044894 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fzl9q"]
Feb 25 11:12:24 crc kubenswrapper[4725]: E0225 11:12:24.045471 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1319aae4-df52-49f2-8baf-3380d31994db" containerName="keystone-bootstrap"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.045487 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1319aae4-df52-49f2-8baf-3380d31994db" containerName="keystone-bootstrap"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.045667 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1319aae4-df52-49f2-8baf-3380d31994db" containerName="keystone-bootstrap"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.046199 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.048350 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bt58t"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.049358 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.052077 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.052353 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.052477 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.059909 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fzl9q"]
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.121655 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-credential-keys\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.121714 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-combined-ca-bundle\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.121786 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-scripts\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.121909 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-config-data\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.122110 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v85vn\" (UniqueName: \"kubernetes.io/projected/cc96366a-6045-408e-9be6-07abc53c1b3e-kube-api-access-v85vn\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.122333 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-fernet-keys\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.224257 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v85vn\" (UniqueName: \"kubernetes.io/projected/cc96366a-6045-408e-9be6-07abc53c1b3e-kube-api-access-v85vn\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.224460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-fernet-keys\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.224595 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-credential-keys\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.224643 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-combined-ca-bundle\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.224724 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-scripts\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.224823 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-config-data\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.233175 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-combined-ca-bundle\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.233461 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-scripts\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.233594 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-fernet-keys\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.235697 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-config-data\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.235795 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-credential-keys\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.256252 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v85vn\" (UniqueName: \"kubernetes.io/projected/cc96366a-6045-408e-9be6-07abc53c1b3e-kube-api-access-v85vn\") pod \"keystone-bootstrap-fzl9q\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.301666 4725 generic.go:334] "Generic (PLEG): container finished" podID="652ed68d-108a-459a-8493-bb798b194940" containerID="742c379fd1cce6dd119e7b9543d95d51a4735d72b9453b172fd15b8c71a2bb4f" exitCode=0
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.301772 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" event={"ID":"652ed68d-108a-459a-8493-bb798b194940","Type":"ContainerDied","Data":"742c379fd1cce6dd119e7b9543d95d51a4735d72b9453b172fd15b8c71a2bb4f"}
Feb 25 11:12:24 crc kubenswrapper[4725]: I0225 11:12:24.365532 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fzl9q"
Feb 25 11:12:25 crc kubenswrapper[4725]: I0225 11:12:25.241349 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1319aae4-df52-49f2-8baf-3380d31994db" path="/var/lib/kubelet/pods/1319aae4-df52-49f2-8baf-3380d31994db/volumes"
Feb 25 11:12:26 crc kubenswrapper[4725]: I0225 11:12:26.415063 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.812761 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.813123 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6dh57fhfbh647h577h98h8bh68fhch67h8dhc4h5bbh54bh5bhcdh7ch8dhcbhc8h67dh648h68dh5cfh54fh56dh74h59ch677h548h694h55dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p92b5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-d9dc4c7c7-dqrgs_openstack(18c3886b-35cd-47aa-aa75-6a23a593eba9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.821167 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-d9dc4c7c7-dqrgs" podUID="18c3886b-35cd-47aa-aa75-6a23a593eba9"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.846286 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.846398 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc5h5c5h578h658h6ch8h5f8h64h7bh7fh8bh5bh98h66h55bh59fhbch7dh577h696h599h56h545h98h5c5h57h697h575h57h68bh66dhc9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxr8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-79c5587bf7-bzj68_openstack(0aebfbbc-99ac-4f7f-b7a6-e02102f97c06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.848249 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-79c5587bf7-bzj68" podUID="0aebfbbc-99ac-4f7f-b7a6-e02102f97c06"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.862116 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.862238 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6dh66h6fh8bh567h99h56ch5cbh685h75h66dh676h646h597h88h66fh595h7fh85hd5h658h684h685h656h546h59ch5cdh585h59dh557h659h579q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snxfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6cd57d84df-rmns4_openstack(2bc15a3e-6ed8-4cab-8f6f-32a1766260b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 25 11:12:30 crc kubenswrapper[4725]: E0225 11:12:30.864107 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6cd57d84df-rmns4" podUID="2bc15a3e-6ed8-4cab-8f6f-32a1766260b1"
Feb 25 11:12:30 crc kubenswrapper[4725]: I0225 11:12:30.943587 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.048219 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-scripts\") pod \"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") "
Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.048313 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-combined-ca-bundle\") pod \"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") "
Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.048343 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-httpd-run\") pod \"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") "
Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.048358 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-public-tls-certs\") pod
\"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.048380 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-config-data\") pod \"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.049010 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.049097 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.049358 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-logs\") pod \"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.049389 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4j5p\" (UniqueName: \"kubernetes.io/projected/b42e0576-0579-42d0-b704-6016cf57ca7a-kube-api-access-n4j5p\") pod \"b42e0576-0579-42d0-b704-6016cf57ca7a\" (UID: \"b42e0576-0579-42d0-b704-6016cf57ca7a\") " Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.049713 4725 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.050000 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-logs" (OuterVolumeSpecName: "logs") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.066217 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.066277 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42e0576-0579-42d0-b704-6016cf57ca7a-kube-api-access-n4j5p" (OuterVolumeSpecName: "kube-api-access-n4j5p") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "kube-api-access-n4j5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.071997 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-scripts" (OuterVolumeSpecName: "scripts") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.097987 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.108488 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.130676 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-config-data" (OuterVolumeSpecName: "config-data") pod "b42e0576-0579-42d0-b704-6016cf57ca7a" (UID: "b42e0576-0579-42d0-b704-6016cf57ca7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.151369 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.151399 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.151408 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.151443 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.151453 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b42e0576-0579-42d0-b704-6016cf57ca7a-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.151463 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4j5p\" (UniqueName: \"kubernetes.io/projected/b42e0576-0579-42d0-b704-6016cf57ca7a-kube-api-access-n4j5p\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.151473 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42e0576-0579-42d0-b704-6016cf57ca7a-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.166864 4725 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.253565 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.370106 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.370119 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b42e0576-0579-42d0-b704-6016cf57ca7a","Type":"ContainerDied","Data":"673db2f61e4d0807bba3e0380738e49b45c12fa0634a2df70db55e3fa7cd6b65"} Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.370188 4725 scope.go:117] "RemoveContainer" containerID="5255a3a6dfa7feb051a8afdbbe5ecfb84f0f93498b7075671c911d55e22a1684" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.414183 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.454727 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.487037 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.519903 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:12:31 crc kubenswrapper[4725]: E0225 11:12:31.520629 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-httpd" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.520649 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-httpd" Feb 25 11:12:31 crc kubenswrapper[4725]: E0225 11:12:31.520656 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-log" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.520664 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-log" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.521058 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-httpd" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.521094 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" containerName="glance-log" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.524542 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.527931 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.528240 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.562664 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.569994 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.570420 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ntwj\" (UniqueName: \"kubernetes.io/projected/bf04584d-e28f-4010-91c0-0dafe5dde54c-kube-api-access-8ntwj\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.570490 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-logs\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " 
pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.574226 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.574615 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.574690 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.574925 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.588231 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:12:31 crc kubenswrapper[4725]: E0225 11:12:31.590323 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 25 11:12:31 crc kubenswrapper[4725]: E0225 11:12:31.590780 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dzvrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-skknf_openstack(cf601308-e467-48ee-998c-7a2ecf04d92c): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:12:31 crc kubenswrapper[4725]: E0225 11:12:31.592673 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-skknf" podUID="cf601308-e467-48ee-998c-7a2ecf04d92c" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.677165 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.677482 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.677508 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.677552 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntwj\" (UniqueName: \"kubernetes.io/projected/bf04584d-e28f-4010-91c0-0dafe5dde54c-kube-api-access-8ntwj\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc 
kubenswrapper[4725]: I0225 11:12:31.677576 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-logs\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.677595 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.677629 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.677647 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.678962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.679630 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-logs\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.679746 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.682799 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.686837 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.693053 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.696013 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntwj\" (UniqueName: 
\"kubernetes.io/projected/bf04584d-e28f-4010-91c0-0dafe5dde54c-kube-api-access-8ntwj\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.696416 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.707239 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " pod="openstack/glance-default-external-api-0" Feb 25 11:12:31 crc kubenswrapper[4725]: I0225 11:12:31.876358 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 11:12:32 crc kubenswrapper[4725]: E0225 11:12:32.080818 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 25 11:12:32 crc kubenswrapper[4725]: E0225 11:12:32.081019 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d6h565h87h54h68h9h544h55dh5c9h5dbh4hcbh7bh59h659h5fdh597h5b7h68dh66ch84h96h645h687h667hd4h544h54dh94h599h5d9h5dbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mccc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/
bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7492d83b-6fd0-420c-99a5-19caedc41981): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:12:32 crc kubenswrapper[4725]: E0225 11:12:32.378330 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-skknf" podUID="cf601308-e467-48ee-998c-7a2ecf04d92c" Feb 25 11:12:33 crc kubenswrapper[4725]: I0225 11:12:33.237457 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b42e0576-0579-42d0-b704-6016cf57ca7a" path="/var/lib/kubelet/pods/b42e0576-0579-42d0-b704-6016cf57ca7a/volumes" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.444763 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" event={"ID":"652ed68d-108a-459a-8493-bb798b194940","Type":"ContainerDied","Data":"85294568e5a9d2e6566847bbbad31a89bc3513f9915dac0d6168ba334c2f49b9"} Feb 25 11:12:39 crc 
kubenswrapper[4725]: I0225 11:12:39.445597 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85294568e5a9d2e6566847bbbad31a89bc3513f9915dac0d6168ba334c2f49b9" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.446480 4725 generic.go:334] "Generic (PLEG): container finished" podID="23a6a21f-d099-43a7-96f6-51c056d4568c" containerID="fb2093968cb49803dda5fbd5f5111472618358218c96b8de28be014644661098" exitCode=0 Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.446566 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mfzn" event={"ID":"23a6a21f-d099-43a7-96f6-51c056d4568c","Type":"ContainerDied","Data":"fb2093968cb49803dda5fbd5f5111472618358218c96b8de28be014644661098"} Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.448332 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d9dc4c7c7-dqrgs" event={"ID":"18c3886b-35cd-47aa-aa75-6a23a593eba9","Type":"ContainerDied","Data":"f962810ff62c31b5bda0c48118393ebfbb7a952b8b6010eb9e8524729d939748"} Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.448355 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f962810ff62c31b5bda0c48118393ebfbb7a952b8b6010eb9e8524729d939748" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.450261 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c5587bf7-bzj68" event={"ID":"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06","Type":"ContainerDied","Data":"79ced7d3b278c31cef2b3e07e40eb612b47487b57d70b1b326aa8d1aba57b41b"} Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.450293 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ced7d3b278c31cef2b3e07e40eb612b47487b57d70b1b326aa8d1aba57b41b" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.451648 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cd57d84df-rmns4" 
event={"ID":"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1","Type":"ContainerDied","Data":"ad4dda2eb6bc1321cd1e69038eae11c321102e1a70a5990832c5e19431fe4faf"} Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.451674 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4dda2eb6bc1321cd1e69038eae11c321102e1a70a5990832c5e19431fe4faf" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.453687 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb4f2286-0a97-42ce-b7f2-39107be8d6bc","Type":"ContainerDied","Data":"1435fdf8e73e6797300d7a7a6303fbbd8d1ca7f9b335c1f8b04a1c0c7d477c70"} Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.453711 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1435fdf8e73e6797300d7a7a6303fbbd8d1ca7f9b335c1f8b04a1c0c7d477c70" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.554492 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d9dc4c7c7-dqrgs" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.562546 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.566775 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cd57d84df-rmns4" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.577634 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79c5587bf7-bzj68" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.585948 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742330 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-config\") pod \"652ed68d-108a-459a-8493-bb798b194940\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742389 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-svc\") pod \"652ed68d-108a-459a-8493-bb798b194940\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742430 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-config-data\") pod \"18c3886b-35cd-47aa-aa75-6a23a593eba9\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742811 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742871 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-logs\") pod \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742900 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p92b5\" (UniqueName: 
\"kubernetes.io/projected/18c3886b-35cd-47aa-aa75-6a23a593eba9-kube-api-access-p92b5\") pod \"18c3886b-35cd-47aa-aa75-6a23a593eba9\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742925 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-config-data\") pod \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.742958 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nd2k\" (UniqueName: \"kubernetes.io/projected/652ed68d-108a-459a-8493-bb798b194940-kube-api-access-8nd2k\") pod \"652ed68d-108a-459a-8493-bb798b194940\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743015 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-internal-tls-certs\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743045 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-swift-storage-0\") pod \"652ed68d-108a-459a-8493-bb798b194940\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743391 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-sb\") pod \"652ed68d-108a-459a-8493-bb798b194940\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " Feb 25 11:12:39 crc 
kubenswrapper[4725]: I0225 11:12:39.743451 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snxfx\" (UniqueName: \"kubernetes.io/projected/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-kube-api-access-snxfx\") pod \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743489 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-scripts\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743514 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb4b9\" (UniqueName: \"kubernetes.io/projected/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-kube-api-access-jb4b9\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743544 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-config-data\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743576 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-nb\") pod \"652ed68d-108a-459a-8493-bb798b194940\" (UID: \"652ed68d-108a-459a-8493-bb798b194940\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743605 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-horizon-secret-key\") pod \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743631 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-httpd-run\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743650 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-combined-ca-bundle\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.743982 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-logs\") pod \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744045 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c3886b-35cd-47aa-aa75-6a23a593eba9-logs\") pod \"18c3886b-35cd-47aa-aa75-6a23a593eba9\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744089 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-horizon-secret-key\") pod \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744110 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-scripts\") pod \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744127 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-logs\") pod \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\" (UID: \"cb4f2286-0a97-42ce-b7f2-39107be8d6bc\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744145 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxr8j\" (UniqueName: \"kubernetes.io/projected/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-kube-api-access-jxr8j\") pod \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\" (UID: \"0aebfbbc-99ac-4f7f-b7a6-e02102f97c06\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744173 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18c3886b-35cd-47aa-aa75-6a23a593eba9-horizon-secret-key\") pod \"18c3886b-35cd-47aa-aa75-6a23a593eba9\" (UID: \"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744229 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-scripts\") pod \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744249 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-scripts\") pod \"18c3886b-35cd-47aa-aa75-6a23a593eba9\" (UID: 
\"18c3886b-35cd-47aa-aa75-6a23a593eba9\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744275 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-config-data\") pod \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\" (UID: \"2bc15a3e-6ed8-4cab-8f6f-32a1766260b1\") " Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.744984 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-config-data" (OuterVolumeSpecName: "config-data") pod "18c3886b-35cd-47aa-aa75-6a23a593eba9" (UID: "18c3886b-35cd-47aa-aa75-6a23a593eba9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.745474 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-config-data" (OuterVolumeSpecName: "config-data") pod "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1" (UID: "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.745913 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-logs" (OuterVolumeSpecName: "logs") pod "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1" (UID: "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.746098 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c3886b-35cd-47aa-aa75-6a23a593eba9-logs" (OuterVolumeSpecName: "logs") pod "18c3886b-35cd-47aa-aa75-6a23a593eba9" (UID: "18c3886b-35cd-47aa-aa75-6a23a593eba9"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.748382 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.748711 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.750939 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1" (UID: "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.751440 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-scripts" (OuterVolumeSpecName: "scripts") pod "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06" (UID: "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.751678 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-logs" (OuterVolumeSpecName: "logs") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.751783 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-kube-api-access-jb4b9" (OuterVolumeSpecName: "kube-api-access-jb4b9") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "kube-api-access-jb4b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.751907 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06" (UID: "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.753099 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-kube-api-access-snxfx" (OuterVolumeSpecName: "kube-api-access-snxfx") pod "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1" (UID: "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1"). InnerVolumeSpecName "kube-api-access-snxfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.754194 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-logs" (OuterVolumeSpecName: "logs") pod "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06" (UID: "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.755009 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-scripts" (OuterVolumeSpecName: "scripts") pod "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1" (UID: "2bc15a3e-6ed8-4cab-8f6f-32a1766260b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.755054 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-kube-api-access-jxr8j" (OuterVolumeSpecName: "kube-api-access-jxr8j") pod "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06" (UID: "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06"). InnerVolumeSpecName "kube-api-access-jxr8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.755222 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c3886b-35cd-47aa-aa75-6a23a593eba9-kube-api-access-p92b5" (OuterVolumeSpecName: "kube-api-access-p92b5") pod "18c3886b-35cd-47aa-aa75-6a23a593eba9" (UID: "18c3886b-35cd-47aa-aa75-6a23a593eba9"). InnerVolumeSpecName "kube-api-access-p92b5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.755339 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-scripts" (OuterVolumeSpecName: "scripts") pod "18c3886b-35cd-47aa-aa75-6a23a593eba9" (UID: "18c3886b-35cd-47aa-aa75-6a23a593eba9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.755770 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-config-data" (OuterVolumeSpecName: "config-data") pod "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06" (UID: "0aebfbbc-99ac-4f7f-b7a6-e02102f97c06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.757752 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c3886b-35cd-47aa-aa75-6a23a593eba9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "18c3886b-35cd-47aa-aa75-6a23a593eba9" (UID: "18c3886b-35cd-47aa-aa75-6a23a593eba9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.758974 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-scripts" (OuterVolumeSpecName: "scripts") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.759305 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652ed68d-108a-459a-8493-bb798b194940-kube-api-access-8nd2k" (OuterVolumeSpecName: "kube-api-access-8nd2k") pod "652ed68d-108a-459a-8493-bb798b194940" (UID: "652ed68d-108a-459a-8493-bb798b194940"). InnerVolumeSpecName "kube-api-access-8nd2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.792126 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "652ed68d-108a-459a-8493-bb798b194940" (UID: "652ed68d-108a-459a-8493-bb798b194940"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.794338 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-config" (OuterVolumeSpecName: "config") pod "652ed68d-108a-459a-8493-bb798b194940" (UID: "652ed68d-108a-459a-8493-bb798b194940"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.797903 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.810463 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-config-data" (OuterVolumeSpecName: "config-data") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.812117 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb4f2286-0a97-42ce-b7f2-39107be8d6bc" (UID: "cb4f2286-0a97-42ce-b7f2-39107be8d6bc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.814518 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "652ed68d-108a-459a-8493-bb798b194940" (UID: "652ed68d-108a-459a-8493-bb798b194940"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.819664 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "652ed68d-108a-459a-8493-bb798b194940" (UID: "652ed68d-108a-459a-8493-bb798b194940"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.824647 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "652ed68d-108a-459a-8493-bb798b194940" (UID: "652ed68d-108a-459a-8493-bb798b194940"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847701 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847738 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847753 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847767 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847779 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847790 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847801 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c3886b-35cd-47aa-aa75-6a23a593eba9-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847814 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847838 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847849 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847862 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxr8j\" (UniqueName: \"kubernetes.io/projected/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-kube-api-access-jxr8j\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847874 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18c3886b-35cd-47aa-aa75-6a23a593eba9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847885 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847895 
4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847906 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847917 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-config\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847929 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847941 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c3886b-35cd-47aa-aa75-6a23a593eba9-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847979 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.847992 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-logs\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848003 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p92b5\" (UniqueName: \"kubernetes.io/projected/18c3886b-35cd-47aa-aa75-6a23a593eba9-kube-api-access-p92b5\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848016 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848027 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nd2k\" (UniqueName: \"kubernetes.io/projected/652ed68d-108a-459a-8493-bb798b194940-kube-api-access-8nd2k\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848039 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848049 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848060 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/652ed68d-108a-459a-8493-bb798b194940-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848071 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snxfx\" (UniqueName: \"kubernetes.io/projected/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1-kube-api-access-snxfx\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848082 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.848094 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb4b9\" (UniqueName: \"kubernetes.io/projected/cb4f2286-0a97-42ce-b7f2-39107be8d6bc-kube-api-access-jb4b9\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.876448 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Feb 25 11:12:39 crc kubenswrapper[4725]: I0225 11:12:39.949796 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.461652 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cd57d84df-rmns4"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.461733 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.461872 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d9dc4c7c7-dqrgs"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.461924 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.461961 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79c5587bf7-bzj68"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.523685 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.552605 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.571961 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mrq7b"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.579854 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mrq7b"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.586536 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 25 11:12:40 crc kubenswrapper[4725]: E0225 11:12:40.586945 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="dnsmasq-dns"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.586958 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="dnsmasq-dns"
Feb 25 11:12:40 crc kubenswrapper[4725]: E0225 11:12:40.586973 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-httpd"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.586979 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-httpd"
Feb 25 11:12:40 crc kubenswrapper[4725]: E0225 11:12:40.586987 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="init"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.586994 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="init"
Feb 25 11:12:40 crc kubenswrapper[4725]: E0225 11:12:40.587005 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-log"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.587013 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-log"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.587160 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="dnsmasq-dns"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.587184 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-httpd"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.587195 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" containerName="glance-log"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.588189 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.590462 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.590927 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.611204 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79c5587bf7-bzj68"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.624579 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79c5587bf7-bzj68"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.634666 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.670098 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cd57d84df-rmns4"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.678976 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6cd57d84df-rmns4"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.698613 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d9dc4c7c7-dqrgs"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.706585 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-d9dc4c7c7-dqrgs"]
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.764549 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.764592 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.764618 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.764644 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.764726 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.764900 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-logs\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.765066 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brqpt\" (UniqueName: \"kubernetes.io/projected/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-kube-api-access-brqpt\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.765093 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.866406 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.866649 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-logs\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.866755 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brqpt\" (UniqueName: \"kubernetes.io/projected/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-kube-api-access-brqpt\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.866843 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.866949 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.867012 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.867100 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.867164 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.867505 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-logs\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.867523 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.868723 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.872379 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.872564 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.873431 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.874370 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.886380 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brqpt\" (UniqueName: \"kubernetes.io/projected/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-kube-api-access-brqpt\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.897236 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") " pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:40 crc kubenswrapper[4725]: I0225 11:12:40.921167 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 25 11:12:41 crc kubenswrapper[4725]: E0225 11:12:41.094190 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Feb 25 11:12:41 crc kubenswrapper[4725]: E0225 11:12:41.094432 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2ppsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-7mk8j_openstack(afe5daf6-23bb-4480-8bd7-724dbb47ad3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 25 11:12:41 crc kubenswrapper[4725]: E0225 11:12:41.096068 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-7mk8j" podUID="afe5daf6-23bb-4480-8bd7-724dbb47ad3d"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.099110 4725 scope.go:117] "RemoveContainer" containerID="0c42f536f26b9542ca4630ede5b63c60a5ecf386875a3bfdebc2a960422da0fc"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.154347 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.236289 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aebfbbc-99ac-4f7f-b7a6-e02102f97c06" path="/var/lib/kubelet/pods/0aebfbbc-99ac-4f7f-b7a6-e02102f97c06/volumes"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.236863 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c3886b-35cd-47aa-aa75-6a23a593eba9" path="/var/lib/kubelet/pods/18c3886b-35cd-47aa-aa75-6a23a593eba9/volumes"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.237408 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc15a3e-6ed8-4cab-8f6f-32a1766260b1" path="/var/lib/kubelet/pods/2bc15a3e-6ed8-4cab-8f6f-32a1766260b1/volumes"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.237872 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652ed68d-108a-459a-8493-bb798b194940" path="/var/lib/kubelet/pods/652ed68d-108a-459a-8493-bb798b194940/volumes"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.239198 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4f2286-0a97-42ce-b7f2-39107be8d6bc" path="/var/lib/kubelet/pods/cb4f2286-0a97-42ce-b7f2-39107be8d6bc/volumes"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.274713 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98k2l\" (UniqueName: \"kubernetes.io/projected/23a6a21f-d099-43a7-96f6-51c056d4568c-kube-api-access-98k2l\") pod \"23a6a21f-d099-43a7-96f6-51c056d4568c\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") "
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.274810 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-config\") pod \"23a6a21f-d099-43a7-96f6-51c056d4568c\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") "
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.275068 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-combined-ca-bundle\") pod \"23a6a21f-d099-43a7-96f6-51c056d4568c\" (UID: \"23a6a21f-d099-43a7-96f6-51c056d4568c\") "
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.279135 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a6a21f-d099-43a7-96f6-51c056d4568c-kube-api-access-98k2l" (OuterVolumeSpecName: "kube-api-access-98k2l") pod "23a6a21f-d099-43a7-96f6-51c056d4568c" (UID: "23a6a21f-d099-43a7-96f6-51c056d4568c"). InnerVolumeSpecName "kube-api-access-98k2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.302550 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23a6a21f-d099-43a7-96f6-51c056d4568c" (UID: "23a6a21f-d099-43a7-96f6-51c056d4568c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.303956 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-config" (OuterVolumeSpecName: "config") pod "23a6a21f-d099-43a7-96f6-51c056d4568c" (UID: "23a6a21f-d099-43a7-96f6-51c056d4568c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.376859 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.376895 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98k2l\" (UniqueName: \"kubernetes.io/projected/23a6a21f-d099-43a7-96f6-51c056d4568c-kube-api-access-98k2l\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.376907 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/23a6a21f-d099-43a7-96f6-51c056d4568c-config\") on node \"crc\" DevicePath \"\""
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.419894 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-mrq7b" podUID="652ed68d-108a-459a-8493-bb798b194940" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.505754 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mfzn"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.506789 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mfzn" event={"ID":"23a6a21f-d099-43a7-96f6-51c056d4568c","Type":"ContainerDied","Data":"74073ca1f5cecb21d68f29826514bf719acded05ab9db4136e8260672f185b24"}
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.506821 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74073ca1f5cecb21d68f29826514bf719acded05ab9db4136e8260672f185b24"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.557762 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.558011 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:12:41 crc kubenswrapper[4725]: E0225 11:12:41.565245 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-7mk8j" podUID="afe5daf6-23bb-4480-8bd7-724dbb47ad3d"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.705168 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-66qfw"]
Feb 25 11:12:41 crc kubenswrapper[4725]: E0225 11:12:41.705596 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a6a21f-d099-43a7-96f6-51c056d4568c" containerName="neutron-db-sync"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.705618 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a6a21f-d099-43a7-96f6-51c056d4568c" containerName="neutron-db-sync"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.705843 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a6a21f-d099-43a7-96f6-51c056d4568c" containerName="neutron-db-sync"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.711549 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.722748 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-66qfw"]
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.793224 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kdvg\" (UniqueName: \"kubernetes.io/projected/8bcf915d-87e3-4faf-8875-adeb9f0146af-kube-api-access-7kdvg\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.793523 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-config\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.793559 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.793581 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.793609 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.793626 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-svc\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.800383 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b9448d47d-2x4vh"]
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.801953 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b9448d47d-2x4vh"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.809665 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.812008 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-n7c24"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.812613 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.812774 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.841507 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b9448d47d-2x4vh"]
Feb 25 11:12:41 crc kubenswrapper[4725]: W0225 11:12:41.850073 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabad9fb0_482e_4ed1_8bf5_e738ee946358.slice/crio-c85eaadb04a5015a0af6f5b45d0cef53dbdbaf4bd7e9981c04a19c8854b66d58 WatchSource:0}: Error finding container c85eaadb04a5015a0af6f5b45d0cef53dbdbaf4bd7e9981c04a19c8854b66d58: Status 404 returned error can't find the container with id c85eaadb04a5015a0af6f5b45d0cef53dbdbaf4bd7e9981c04a19c8854b66d58
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.864081 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64cd88bfbd-zxddf"]
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.896677 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-config\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.896749 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.896773 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.896802 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.896816 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-svc\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.896995 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kdvg\" (UniqueName: \"kubernetes.io/projected/8bcf915d-87e3-4faf-8875-adeb9f0146af-kube-api-access-7kdvg\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.898515 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.898481 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.898723 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.898867 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-svc\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.900532 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-config\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.922022 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdvg\" (UniqueName:
\"kubernetes.io/projected/8bcf915d-87e3-4faf-8875-adeb9f0146af-kube-api-access-7kdvg\") pod \"dnsmasq-dns-55f844cf75-66qfw\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") " pod="openstack/dnsmasq-dns-55f844cf75-66qfw" Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.998283 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-config\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.998337 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l22ml\" (UniqueName: \"kubernetes.io/projected/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-kube-api-access-l22ml\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.998413 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-ovndb-tls-certs\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.998431 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-httpd-config\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:41 crc kubenswrapper[4725]: I0225 11:12:41.998459 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-combined-ca-bundle\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.000939 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cbf649584-gsrdx"] Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.060650 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.116856 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-ovndb-tls-certs\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.117106 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-httpd-config\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.117133 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-combined-ca-bundle\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.117184 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-config\") pod \"neutron-7b9448d47d-2x4vh\" (UID: 
\"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.117220 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22ml\" (UniqueName: \"kubernetes.io/projected/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-kube-api-access-l22ml\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.130888 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-httpd-config\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.131031 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-combined-ca-bundle\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.131593 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-config\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.137968 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.139768 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-ovndb-tls-certs\") pod 
\"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.144666 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22ml\" (UniqueName: \"kubernetes.io/projected/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-kube-api-access-l22ml\") pod \"neutron-7b9448d47d-2x4vh\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.164978 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fzl9q"] Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.311366 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.428691 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.521337 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerStarted","Data":"e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.528509 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cbf649584-gsrdx" event={"ID":"f017ec2d-5d1b-405c-b2f7-b3212e3696d7","Type":"ContainerStarted","Data":"1868cf1ad9d4c06943ca903108b1faedeae4710002458b554330686c53633d44"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.533002 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-djg6t" event={"ID":"76768b73-31d1-407a-90e7-9583d2b3a773","Type":"ContainerStarted","Data":"005c1a8807f48f2b85fbba453f6c6664a70ca409b6c51dfdab7deae6234c2706"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.534441 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf04584d-e28f-4010-91c0-0dafe5dde54c","Type":"ContainerStarted","Data":"147062aa4bb8bf51b9bbd7b380a54cc4ab79525d6c429e39bf34f18b5c3d35f2"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.545216 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cd88bfbd-zxddf" event={"ID":"abad9fb0-482e-4ed1-8bf5-e738ee946358","Type":"ContainerStarted","Data":"c85eaadb04a5015a0af6f5b45d0cef53dbdbaf4bd7e9981c04a19c8854b66d58"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.551603 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-djg6t" podStartSLOduration=6.396258715 podStartE2EDuration="30.55158651s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="2026-02-25 11:12:15.264889761 +0000 UTC m=+1160.763471786" lastFinishedPulling="2026-02-25 11:12:39.420217546 +0000 UTC m=+1184.918799581" observedRunningTime="2026-02-25 11:12:42.547963874 +0000 UTC m=+1188.046545899" watchObservedRunningTime="2026-02-25 11:12:42.55158651 +0000 UTC m=+1188.050168535" Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.553852 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fzl9q" event={"ID":"cc96366a-6045-408e-9be6-07abc53c1b3e","Type":"ContainerStarted","Data":"4221c2a6df90a5d299315a43805dab436dd16b2d9f7c256685563fd04185ddc6"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.553930 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fzl9q" event={"ID":"cc96366a-6045-408e-9be6-07abc53c1b3e","Type":"ContainerStarted","Data":"fc2fee46b63a08c7dd7e0f74faa4e79d5c8f7c04c5049e56d17ce244ae26e64f"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.557373 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"91566ab6-1ac2-4b2b-b705-c049b68e1ab1","Type":"ContainerStarted","Data":"0589e5bcc2fe24c3ccea0736d09bbea4dac750f397d73ccefb8533fa881437e4"} Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.576587 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-66qfw"] Feb 25 11:12:42 crc kubenswrapper[4725]: I0225 11:12:42.605734 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fzl9q" podStartSLOduration=18.605699014 podStartE2EDuration="18.605699014s" podCreationTimestamp="2026-02-25 11:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:42.570560557 +0000 UTC m=+1188.069142612" watchObservedRunningTime="2026-02-25 11:12:42.605699014 +0000 UTC m=+1188.104281039" Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.144347 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b9448d47d-2x4vh"] Feb 25 11:12:43 crc kubenswrapper[4725]: W0225 11:12:43.194135 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36f15650_4f16_4e3b_94cf_a80bcb7c3fde.slice/crio-1f92df5cf8b21ffedbbbc0812b584252f3ff988ffe144b32fea635a3f909cf8a WatchSource:0}: Error finding container 1f92df5cf8b21ffedbbbc0812b584252f3ff988ffe144b32fea635a3f909cf8a: Status 404 returned error can't find the container with id 1f92df5cf8b21ffedbbbc0812b584252f3ff988ffe144b32fea635a3f909cf8a Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.569147 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91566ab6-1ac2-4b2b-b705-c049b68e1ab1","Type":"ContainerStarted","Data":"8b367172e8919f938670f03a6303378703dfbba29b2de04882da1c7955816207"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.579511 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-7cbf649584-gsrdx" event={"ID":"f017ec2d-5d1b-405c-b2f7-b3212e3696d7","Type":"ContainerStarted","Data":"0570472c8804b33e1c44cbc6cb0a50f1656cd9a8824df607e27b926a7c91564d"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.579586 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cbf649584-gsrdx" event={"ID":"f017ec2d-5d1b-405c-b2f7-b3212e3696d7","Type":"ContainerStarted","Data":"c48a3ca168fe61074e5bf46411fc9696f3e1dfb4e84e48b57f19e1c30a0acf38"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.585527 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf04584d-e28f-4010-91c0-0dafe5dde54c","Type":"ContainerStarted","Data":"607377565ac3041c8ebf6cae37de619f939247e3524af84a69e1df7982db5a95"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.604320 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cbf649584-gsrdx" podStartSLOduration=22.123767282 podStartE2EDuration="22.604301197s" podCreationTimestamp="2026-02-25 11:12:21 +0000 UTC" firstStartedPulling="2026-02-25 11:12:42.016243704 +0000 UTC m=+1187.514825729" lastFinishedPulling="2026-02-25 11:12:42.496777609 +0000 UTC m=+1187.995359644" observedRunningTime="2026-02-25 11:12:43.597572498 +0000 UTC m=+1189.096154523" watchObservedRunningTime="2026-02-25 11:12:43.604301197 +0000 UTC m=+1189.102883232" Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.609349 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cd88bfbd-zxddf" event={"ID":"abad9fb0-482e-4ed1-8bf5-e738ee946358","Type":"ContainerStarted","Data":"071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.609414 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cd88bfbd-zxddf" 
event={"ID":"abad9fb0-482e-4ed1-8bf5-e738ee946358","Type":"ContainerStarted","Data":"814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.626086 4725 generic.go:334] "Generic (PLEG): container finished" podID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerID="341f2d56520a68078117358a0ff222d8b5adc331235c5d23d73c92f0e6a1f98e" exitCode=0 Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.626164 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" event={"ID":"8bcf915d-87e3-4faf-8875-adeb9f0146af","Type":"ContainerDied","Data":"341f2d56520a68078117358a0ff222d8b5adc331235c5d23d73c92f0e6a1f98e"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.626185 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" event={"ID":"8bcf915d-87e3-4faf-8875-adeb9f0146af","Type":"ContainerStarted","Data":"ca34d6f1705fb62a862645e2cf68ed8dd88ff6b9bafe56400faabd4b2b197e79"} Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.638917 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64cd88bfbd-zxddf" podStartSLOduration=23.162640339 podStartE2EDuration="23.63889703s" podCreationTimestamp="2026-02-25 11:12:20 +0000 UTC" firstStartedPulling="2026-02-25 11:12:41.855759793 +0000 UTC m=+1187.354341818" lastFinishedPulling="2026-02-25 11:12:42.332016484 +0000 UTC m=+1187.830598509" observedRunningTime="2026-02-25 11:12:43.631615076 +0000 UTC m=+1189.130197111" watchObservedRunningTime="2026-02-25 11:12:43.63889703 +0000 UTC m=+1189.137479055" Feb 25 11:12:43 crc kubenswrapper[4725]: I0225 11:12:43.666864 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9448d47d-2x4vh" event={"ID":"36f15650-4f16-4e3b-94cf-a80bcb7c3fde","Type":"ContainerStarted","Data":"0e837a9df0516ec462c52022c8a572fd94d7b77ba861d5ed648de97086ec1d9b"} Feb 25 11:12:43 crc 
kubenswrapper[4725]: I0225 11:12:43.666903 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9448d47d-2x4vh" event={"ID":"36f15650-4f16-4e3b-94cf-a80bcb7c3fde","Type":"ContainerStarted","Data":"1f92df5cf8b21ffedbbbc0812b584252f3ff988ffe144b32fea635a3f909cf8a"} Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.087349 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78c8d69889-vkkmw"] Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.089413 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.098187 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.098435 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.098990 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78c8d69889-vkkmw"] Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.162250 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-ovndb-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.162315 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-combined-ca-bundle\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.162336 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-httpd-config\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.162356 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brjvf\" (UniqueName: \"kubernetes.io/projected/762b572a-f761-4bb6-8e01-8ba87c01262c-kube-api-access-brjvf\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.162373 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-config\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.162402 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-public-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.162429 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-internal-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.263796 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-combined-ca-bundle\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.263849 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-httpd-config\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.263877 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brjvf\" (UniqueName: \"kubernetes.io/projected/762b572a-f761-4bb6-8e01-8ba87c01262c-kube-api-access-brjvf\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.263911 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-config\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.263955 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-public-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.263995 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-internal-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.264097 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-ovndb-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.269808 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-httpd-config\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.270421 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-public-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.271700 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-ovndb-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.273072 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-combined-ca-bundle\") pod \"neutron-78c8d69889-vkkmw\" 
(UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.276072 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-internal-tls-certs\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.284657 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-config\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.287625 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brjvf\" (UniqueName: \"kubernetes.io/projected/762b572a-f761-4bb6-8e01-8ba87c01262c-kube-api-access-brjvf\") pod \"neutron-78c8d69889-vkkmw\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.417505 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.746767 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf04584d-e28f-4010-91c0-0dafe5dde54c","Type":"ContainerStarted","Data":"256749a73a6e2107d4f6e5e9d37f972c00d438e33c8c8460bfe4fbcc9346b834"} Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.749041 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" event={"ID":"8bcf915d-87e3-4faf-8875-adeb9f0146af","Type":"ContainerStarted","Data":"e1a6af681df5efd1e55b7fa96a69717eedd2256d3b18d0929498e03b4942b96f"} Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.749556 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.759195 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9448d47d-2x4vh" event={"ID":"36f15650-4f16-4e3b-94cf-a80bcb7c3fde","Type":"ContainerStarted","Data":"8ee55fe701e26882854163396a4b7c2ce444570c5c52397ccaddebebdaabb7ba"} Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.759695 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.774511 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91566ab6-1ac2-4b2b-b705-c049b68e1ab1","Type":"ContainerStarted","Data":"01fe3b1ee2f8aa8ca4385d279b32ba554348f15c838f6ba17a89bae0bc2fb4a5"} Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.782003 4725 generic.go:334] "Generic (PLEG): container finished" podID="76768b73-31d1-407a-90e7-9583d2b3a773" containerID="005c1a8807f48f2b85fbba453f6c6664a70ca409b6c51dfdab7deae6234c2706" exitCode=0 Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.782221 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-djg6t" event={"ID":"76768b73-31d1-407a-90e7-9583d2b3a773","Type":"ContainerDied","Data":"005c1a8807f48f2b85fbba453f6c6664a70ca409b6c51dfdab7deae6234c2706"} Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.806764 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.806734067 podStartE2EDuration="13.806734067s" podCreationTimestamp="2026-02-25 11:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:44.781666888 +0000 UTC m=+1190.280248903" watchObservedRunningTime="2026-02-25 11:12:44.806734067 +0000 UTC m=+1190.305316092" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.863098 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b9448d47d-2x4vh" podStartSLOduration=3.863076219 podStartE2EDuration="3.863076219s" podCreationTimestamp="2026-02-25 11:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:44.811665838 +0000 UTC m=+1190.310247883" watchObservedRunningTime="2026-02-25 11:12:44.863076219 +0000 UTC m=+1190.361658244" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 11:12:44.884115 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.8840879 podStartE2EDuration="4.8840879s" podCreationTimestamp="2026-02-25 11:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:44.839954553 +0000 UTC m=+1190.338536588" watchObservedRunningTime="2026-02-25 11:12:44.8840879 +0000 UTC m=+1190.382669925" Feb 25 11:12:44 crc kubenswrapper[4725]: I0225 
11:12:44.896308 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" podStartSLOduration=3.896288855 podStartE2EDuration="3.896288855s" podCreationTimestamp="2026-02-25 11:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:44.884574553 +0000 UTC m=+1190.383156578" watchObservedRunningTime="2026-02-25 11:12:44.896288855 +0000 UTC m=+1190.394870870" Feb 25 11:12:45 crc kubenswrapper[4725]: I0225 11:12:45.130291 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78c8d69889-vkkmw"] Feb 25 11:12:45 crc kubenswrapper[4725]: W0225 11:12:45.144959 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod762b572a_f761_4bb6_8e01_8ba87c01262c.slice/crio-43677c094327e83fa3b5895b3981013eb45fa730bf16bf71a81fd7832055ea35 WatchSource:0}: Error finding container 43677c094327e83fa3b5895b3981013eb45fa730bf16bf71a81fd7832055ea35: Status 404 returned error can't find the container with id 43677c094327e83fa3b5895b3981013eb45fa730bf16bf71a81fd7832055ea35 Feb 25 11:12:45 crc kubenswrapper[4725]: I0225 11:12:45.792848 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c8d69889-vkkmw" event={"ID":"762b572a-f761-4bb6-8e01-8ba87c01262c","Type":"ContainerStarted","Data":"aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a"} Feb 25 11:12:45 crc kubenswrapper[4725]: I0225 11:12:45.792888 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c8d69889-vkkmw" event={"ID":"762b572a-f761-4bb6-8e01-8ba87c01262c","Type":"ContainerStarted","Data":"43677c094327e83fa3b5895b3981013eb45fa730bf16bf71a81fd7832055ea35"} Feb 25 11:12:45 crc kubenswrapper[4725]: I0225 11:12:45.798096 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-skknf" event={"ID":"cf601308-e467-48ee-998c-7a2ecf04d92c","Type":"ContainerStarted","Data":"3ed375a5c0694529b49eecaf54ba821794fc615f8cc55fe46ef417bc85fd8d5b"} Feb 25 11:12:45 crc kubenswrapper[4725]: I0225 11:12:45.818348 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-skknf" podStartSLOduration=3.857439301 podStartE2EDuration="33.818330256s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="2026-02-25 11:12:15.014615683 +0000 UTC m=+1160.513197708" lastFinishedPulling="2026-02-25 11:12:44.975506628 +0000 UTC m=+1190.474088663" observedRunningTime="2026-02-25 11:12:45.809347507 +0000 UTC m=+1191.307929532" watchObservedRunningTime="2026-02-25 11:12:45.818330256 +0000 UTC m=+1191.316912281" Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.825155 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-djg6t" event={"ID":"76768b73-31d1-407a-90e7-9583d2b3a773","Type":"ContainerDied","Data":"4bc48bbcc698e179c27e67436bb42169df64489e353bd9aa7ebb2337d56ae97c"} Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.825686 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc48bbcc698e179c27e67436bb42169df64489e353bd9aa7ebb2337d56ae97c" Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.882418 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.964491 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-scripts\") pod \"76768b73-31d1-407a-90e7-9583d2b3a773\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.964540 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76768b73-31d1-407a-90e7-9583d2b3a773-logs\") pod \"76768b73-31d1-407a-90e7-9583d2b3a773\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.964564 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d76b9\" (UniqueName: \"kubernetes.io/projected/76768b73-31d1-407a-90e7-9583d2b3a773-kube-api-access-d76b9\") pod \"76768b73-31d1-407a-90e7-9583d2b3a773\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.964598 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-config-data\") pod \"76768b73-31d1-407a-90e7-9583d2b3a773\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.964628 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-combined-ca-bundle\") pod \"76768b73-31d1-407a-90e7-9583d2b3a773\" (UID: \"76768b73-31d1-407a-90e7-9583d2b3a773\") " Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.964967 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/76768b73-31d1-407a-90e7-9583d2b3a773-logs" (OuterVolumeSpecName: "logs") pod "76768b73-31d1-407a-90e7-9583d2b3a773" (UID: "76768b73-31d1-407a-90e7-9583d2b3a773"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.965455 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76768b73-31d1-407a-90e7-9583d2b3a773-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.969073 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-scripts" (OuterVolumeSpecName: "scripts") pod "76768b73-31d1-407a-90e7-9583d2b3a773" (UID: "76768b73-31d1-407a-90e7-9583d2b3a773"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.988676 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76768b73-31d1-407a-90e7-9583d2b3a773-kube-api-access-d76b9" (OuterVolumeSpecName: "kube-api-access-d76b9") pod "76768b73-31d1-407a-90e7-9583d2b3a773" (UID: "76768b73-31d1-407a-90e7-9583d2b3a773"). InnerVolumeSpecName "kube-api-access-d76b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:47 crc kubenswrapper[4725]: I0225 11:12:47.999290 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76768b73-31d1-407a-90e7-9583d2b3a773" (UID: "76768b73-31d1-407a-90e7-9583d2b3a773"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.066983 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-config-data" (OuterVolumeSpecName: "config-data") pod "76768b73-31d1-407a-90e7-9583d2b3a773" (UID: "76768b73-31d1-407a-90e7-9583d2b3a773"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.067413 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.067441 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d76b9\" (UniqueName: \"kubernetes.io/projected/76768b73-31d1-407a-90e7-9583d2b3a773-kube-api-access-d76b9\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.067454 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.067464 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76768b73-31d1-407a-90e7-9583d2b3a773-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.832929 4725 generic.go:334] "Generic (PLEG): container finished" podID="cc96366a-6045-408e-9be6-07abc53c1b3e" containerID="4221c2a6df90a5d299315a43805dab436dd16b2d9f7c256685563fd04185ddc6" exitCode=0 Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.833206 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-djg6t" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.832991 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fzl9q" event={"ID":"cc96366a-6045-408e-9be6-07abc53c1b3e","Type":"ContainerDied","Data":"4221c2a6df90a5d299315a43805dab436dd16b2d9f7c256685563fd04185ddc6"} Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.985067 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-744d85fb8-vb847"] Feb 25 11:12:48 crc kubenswrapper[4725]: E0225 11:12:48.985453 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76768b73-31d1-407a-90e7-9583d2b3a773" containerName="placement-db-sync" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.985466 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="76768b73-31d1-407a-90e7-9583d2b3a773" containerName="placement-db-sync" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.985662 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="76768b73-31d1-407a-90e7-9583d2b3a773" containerName="placement-db-sync" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.986618 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:48 crc kubenswrapper[4725]: I0225 11:12:48.998256 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-744d85fb8-vb847"] Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.030383 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.031511 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-km7bc" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.031750 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.031975 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.032134 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.086744 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-internal-tls-certs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.086815 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjr5w\" (UniqueName: \"kubernetes.io/projected/1b756696-a908-43f3-8b48-f6ceadb25bb6-kube-api-access-cjr5w\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.086970 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-config-data\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.087025 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b756696-a908-43f3-8b48-f6ceadb25bb6-logs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.087092 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-combined-ca-bundle\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.087119 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-scripts\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.087136 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-public-tls-certs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.188437 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-combined-ca-bundle\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.188490 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-scripts\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.188512 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-public-tls-certs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.188582 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-internal-tls-certs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.188647 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjr5w\" (UniqueName: \"kubernetes.io/projected/1b756696-a908-43f3-8b48-f6ceadb25bb6-kube-api-access-cjr5w\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.189392 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-config-data\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.189453 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b756696-a908-43f3-8b48-f6ceadb25bb6-logs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.189865 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b756696-a908-43f3-8b48-f6ceadb25bb6-logs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.193611 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-config-data\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.200854 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-scripts\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.201097 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-internal-tls-certs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " 
pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.201296 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-public-tls-certs\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.201455 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-combined-ca-bundle\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.203792 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjr5w\" (UniqueName: \"kubernetes.io/projected/1b756696-a908-43f3-8b48-f6ceadb25bb6-kube-api-access-cjr5w\") pod \"placement-744d85fb8-vb847\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:49 crc kubenswrapper[4725]: I0225 11:12:49.346292 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:50 crc kubenswrapper[4725]: I0225 11:12:50.851771 4725 generic.go:334] "Generic (PLEG): container finished" podID="cf601308-e467-48ee-998c-7a2ecf04d92c" containerID="3ed375a5c0694529b49eecaf54ba821794fc615f8cc55fe46ef417bc85fd8d5b" exitCode=0 Feb 25 11:12:50 crc kubenswrapper[4725]: I0225 11:12:50.851874 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-skknf" event={"ID":"cf601308-e467-48ee-998c-7a2ecf04d92c","Type":"ContainerDied","Data":"3ed375a5c0694529b49eecaf54ba821794fc615f8cc55fe46ef417bc85fd8d5b"} Feb 25 11:12:50 crc kubenswrapper[4725]: I0225 11:12:50.922974 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:50 crc kubenswrapper[4725]: I0225 11:12:50.923039 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:50 crc kubenswrapper[4725]: I0225 11:12:50.968714 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:50 crc kubenswrapper[4725]: I0225 11:12:50.978684 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.340237 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.340395 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.405194 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cbf649584-gsrdx" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.406772 4725 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/horizon-7cbf649584-gsrdx" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.584694 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fzl9q" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.660378 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-scripts\") pod \"cc96366a-6045-408e-9be6-07abc53c1b3e\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.660499 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-config-data\") pod \"cc96366a-6045-408e-9be6-07abc53c1b3e\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.660534 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-fernet-keys\") pod \"cc96366a-6045-408e-9be6-07abc53c1b3e\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.660589 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v85vn\" (UniqueName: \"kubernetes.io/projected/cc96366a-6045-408e-9be6-07abc53c1b3e-kube-api-access-v85vn\") pod \"cc96366a-6045-408e-9be6-07abc53c1b3e\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.660621 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-combined-ca-bundle\") pod \"cc96366a-6045-408e-9be6-07abc53c1b3e\" (UID: 
\"cc96366a-6045-408e-9be6-07abc53c1b3e\") " Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.660692 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-credential-keys\") pod \"cc96366a-6045-408e-9be6-07abc53c1b3e\" (UID: \"cc96366a-6045-408e-9be6-07abc53c1b3e\") " Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.669912 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cc96366a-6045-408e-9be6-07abc53c1b3e" (UID: "cc96366a-6045-408e-9be6-07abc53c1b3e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.670980 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cc96366a-6045-408e-9be6-07abc53c1b3e" (UID: "cc96366a-6045-408e-9be6-07abc53c1b3e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.672224 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc96366a-6045-408e-9be6-07abc53c1b3e-kube-api-access-v85vn" (OuterVolumeSpecName: "kube-api-access-v85vn") pod "cc96366a-6045-408e-9be6-07abc53c1b3e" (UID: "cc96366a-6045-408e-9be6-07abc53c1b3e"). InnerVolumeSpecName "kube-api-access-v85vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.676959 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-scripts" (OuterVolumeSpecName: "scripts") pod "cc96366a-6045-408e-9be6-07abc53c1b3e" (UID: "cc96366a-6045-408e-9be6-07abc53c1b3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.706466 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc96366a-6045-408e-9be6-07abc53c1b3e" (UID: "cc96366a-6045-408e-9be6-07abc53c1b3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.709986 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-config-data" (OuterVolumeSpecName: "config-data") pod "cc96366a-6045-408e-9be6-07abc53c1b3e" (UID: "cc96366a-6045-408e-9be6-07abc53c1b3e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.762614 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.762648 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.762660 4725 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.762668 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v85vn\" (UniqueName: \"kubernetes.io/projected/cc96366a-6045-408e-9be6-07abc53c1b3e-kube-api-access-v85vn\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.762679 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.762689 4725 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc96366a-6045-408e-9be6-07abc53c1b3e-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.864786 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fzl9q" event={"ID":"cc96366a-6045-408e-9be6-07abc53c1b3e","Type":"ContainerDied","Data":"fc2fee46b63a08c7dd7e0f74faa4e79d5c8f7c04c5049e56d17ce244ae26e64f"} Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 
11:12:51.865152 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc2fee46b63a08c7dd7e0f74faa4e79d5c8f7c04c5049e56d17ce244ae26e64f" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.865245 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fzl9q" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.869705 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c8d69889-vkkmw" event={"ID":"762b572a-f761-4bb6-8e01-8ba87c01262c","Type":"ContainerStarted","Data":"434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11"} Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.870979 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.874613 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerStarted","Data":"c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef"} Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.874815 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.874995 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.876645 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.877456 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.903929 4725 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/neutron-78c8d69889-vkkmw" podStartSLOduration=7.903896079 podStartE2EDuration="7.903896079s" podCreationTimestamp="2026-02-25 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:51.900990111 +0000 UTC m=+1197.399572186" watchObservedRunningTime="2026-02-25 11:12:51.903896079 +0000 UTC m=+1197.402478104" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.936416 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.966920 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-744d85fb8-vb847"] Feb 25 11:12:51 crc kubenswrapper[4725]: I0225 11:12:51.985217 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.062007 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.167610 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-djnkv"] Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.178398 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" podUID="90402c1e-560a-4551-a218-91d0e04760a4" containerName="dnsmasq-dns" containerID="cri-o://947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959" gracePeriod=10 Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.456246 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.583940 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-combined-ca-bundle\") pod \"cf601308-e467-48ee-998c-7a2ecf04d92c\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.584250 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzvrb\" (UniqueName: \"kubernetes.io/projected/cf601308-e467-48ee-998c-7a2ecf04d92c-kube-api-access-dzvrb\") pod \"cf601308-e467-48ee-998c-7a2ecf04d92c\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.584293 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-db-sync-config-data\") pod \"cf601308-e467-48ee-998c-7a2ecf04d92c\" (UID: \"cf601308-e467-48ee-998c-7a2ecf04d92c\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.589068 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cf601308-e467-48ee-998c-7a2ecf04d92c" (UID: "cf601308-e467-48ee-998c-7a2ecf04d92c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.601310 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf601308-e467-48ee-998c-7a2ecf04d92c-kube-api-access-dzvrb" (OuterVolumeSpecName: "kube-api-access-dzvrb") pod "cf601308-e467-48ee-998c-7a2ecf04d92c" (UID: "cf601308-e467-48ee-998c-7a2ecf04d92c"). 
InnerVolumeSpecName "kube-api-access-dzvrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.656014 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf601308-e467-48ee-998c-7a2ecf04d92c" (UID: "cf601308-e467-48ee-998c-7a2ecf04d92c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.689505 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.689538 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzvrb\" (UniqueName: \"kubernetes.io/projected/cf601308-e467-48ee-998c-7a2ecf04d92c-kube-api-access-dzvrb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.689573 4725 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf601308-e467-48ee-998c-7a2ecf04d92c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.710677 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7dcb568bf7-chvcs"] Feb 25 11:12:52 crc kubenswrapper[4725]: E0225 11:12:52.711203 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc96366a-6045-408e-9be6-07abc53c1b3e" containerName="keystone-bootstrap" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.711218 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc96366a-6045-408e-9be6-07abc53c1b3e" containerName="keystone-bootstrap" Feb 25 11:12:52 crc kubenswrapper[4725]: E0225 
11:12:52.711264 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf601308-e467-48ee-998c-7a2ecf04d92c" containerName="barbican-db-sync" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.711272 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf601308-e467-48ee-998c-7a2ecf04d92c" containerName="barbican-db-sync" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.711458 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc96366a-6045-408e-9be6-07abc53c1b3e" containerName="keystone-bootstrap" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.711497 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf601308-e467-48ee-998c-7a2ecf04d92c" containerName="barbican-db-sync" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.712191 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.714743 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.715974 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.716441 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.716605 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bt58t" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.716734 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.716964 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 
11:12:52.736617 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7dcb568bf7-chvcs"] Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.794555 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jx7\" (UniqueName: \"kubernetes.io/projected/8145d393-0967-4acc-bd07-befcc3252202-kube-api-access-l2jx7\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.794667 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-credential-keys\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.794742 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-internal-tls-certs\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.794782 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-fernet-keys\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.794814 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-config-data\") 
pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.794879 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-combined-ca-bundle\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.794935 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-public-tls-certs\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.795052 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-scripts\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.820251 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.899629 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2tps\" (UniqueName: \"kubernetes.io/projected/90402c1e-560a-4551-a218-91d0e04760a4-kube-api-access-l2tps\") pod \"90402c1e-560a-4551-a218-91d0e04760a4\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.899684 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-swift-storage-0\") pod \"90402c1e-560a-4551-a218-91d0e04760a4\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.899735 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-config\") pod \"90402c1e-560a-4551-a218-91d0e04760a4\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.899840 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-nb\") pod \"90402c1e-560a-4551-a218-91d0e04760a4\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.899946 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-svc\") pod \"90402c1e-560a-4551-a218-91d0e04760a4\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.899968 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-sb\") pod \"90402c1e-560a-4551-a218-91d0e04760a4\" (UID: \"90402c1e-560a-4551-a218-91d0e04760a4\") " Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900168 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-scripts\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900208 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jx7\" (UniqueName: \"kubernetes.io/projected/8145d393-0967-4acc-bd07-befcc3252202-kube-api-access-l2jx7\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900233 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-credential-keys\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900276 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-internal-tls-certs\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900311 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-fernet-keys\") pod 
\"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900326 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-config-data\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900346 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-combined-ca-bundle\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.900369 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-public-tls-certs\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.916490 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-fernet-keys\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.922580 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744d85fb8-vb847" event={"ID":"1b756696-a908-43f3-8b48-f6ceadb25bb6","Type":"ContainerStarted","Data":"b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c"} Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 
11:12:52.922657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744d85fb8-vb847" event={"ID":"1b756696-a908-43f3-8b48-f6ceadb25bb6","Type":"ContainerStarted","Data":"195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548"} Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.922682 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.922693 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744d85fb8-vb847" event={"ID":"1b756696-a908-43f3-8b48-f6ceadb25bb6","Type":"ContainerStarted","Data":"2b458c57fa708f7365fde91c901a5ca3811b91388df58200727794339cc2ff1d"} Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.922706 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.925817 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-public-tls-certs\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.926133 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-credential-keys\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.929195 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90402c1e-560a-4551-a218-91d0e04760a4-kube-api-access-l2tps" (OuterVolumeSpecName: "kube-api-access-l2tps") pod "90402c1e-560a-4551-a218-91d0e04760a4" (UID: 
"90402c1e-560a-4551-a218-91d0e04760a4"). InnerVolumeSpecName "kube-api-access-l2tps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.931146 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jx7\" (UniqueName: \"kubernetes.io/projected/8145d393-0967-4acc-bd07-befcc3252202-kube-api-access-l2jx7\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.932751 4725 generic.go:334] "Generic (PLEG): container finished" podID="90402c1e-560a-4551-a218-91d0e04760a4" containerID="947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959" exitCode=0 Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.932863 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" event={"ID":"90402c1e-560a-4551-a218-91d0e04760a4","Type":"ContainerDied","Data":"947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959"} Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.932893 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" event={"ID":"90402c1e-560a-4551-a218-91d0e04760a4","Type":"ContainerDied","Data":"26051ee387f3121d4741a2e095b3a58f55095f991b106b12b6ad26a90dde0ce0"} Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.932911 4725 scope.go:117] "RemoveContainer" containerID="947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.933051 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-djnkv" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.933306 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-internal-tls-certs\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.936760 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-config-data\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.941244 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-combined-ca-bundle\") pod \"keystone-7dcb568bf7-chvcs\" (UID: \"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.942862 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-skknf" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.947113 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-skknf" event={"ID":"cf601308-e467-48ee-998c-7a2ecf04d92c","Type":"ContainerDied","Data":"a2985c4f57060bdd63a1a81a2c8cdab408999172451a92ab8b1ebad27150b933"} Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.947242 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2985c4f57060bdd63a1a81a2c8cdab408999172451a92ab8b1ebad27150b933" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.948137 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.948193 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.950192 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-744d85fb8-vb847" podStartSLOduration=4.950177293 podStartE2EDuration="4.950177293s" podCreationTimestamp="2026-02-25 11:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:52.946810664 +0000 UTC m=+1198.445392709" watchObservedRunningTime="2026-02-25 11:12:52.950177293 +0000 UTC m=+1198.448759338" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.960621 4725 scope.go:117] "RemoveContainer" containerID="59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db" Feb 25 11:12:52 crc kubenswrapper[4725]: I0225 11:12:52.957604 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8145d393-0967-4acc-bd07-befcc3252202-scripts\") pod \"keystone-7dcb568bf7-chvcs\" (UID: 
\"8145d393-0967-4acc-bd07-befcc3252202\") " pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.003128 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2tps\" (UniqueName: \"kubernetes.io/projected/90402c1e-560a-4551-a218-91d0e04760a4-kube-api-access-l2tps\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.003764 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-config" (OuterVolumeSpecName: "config") pod "90402c1e-560a-4551-a218-91d0e04760a4" (UID: "90402c1e-560a-4551-a218-91d0e04760a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.026913 4725 scope.go:117] "RemoveContainer" containerID="947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959" Feb 25 11:12:53 crc kubenswrapper[4725]: E0225 11:12:53.051897 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959\": container with ID starting with 947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959 not found: ID does not exist" containerID="947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.051939 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959"} err="failed to get container status \"947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959\": rpc error: code = NotFound desc = could not find container \"947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959\": container with ID starting with 947a8dd38aba801ad866ece1036a9614acee44dfe2c8d18bb41cd82697802959 not 
found: ID does not exist" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.051963 4725 scope.go:117] "RemoveContainer" containerID="59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.052748 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90402c1e-560a-4551-a218-91d0e04760a4" (UID: "90402c1e-560a-4551-a218-91d0e04760a4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:53 crc kubenswrapper[4725]: E0225 11:12:53.052979 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db\": container with ID starting with 59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db not found: ID does not exist" containerID="59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.053002 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db"} err="failed to get container status \"59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db\": rpc error: code = NotFound desc = could not find container \"59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db\": container with ID starting with 59d8d22005681ef80ee71fba0d4f5fd479cdb169de3bb023c8a6584ff62fd6db not found: ID does not exist" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.055732 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.074003 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6df8d5688f-fkmbb"] Feb 25 11:12:53 crc kubenswrapper[4725]: E0225 11:12:53.074555 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90402c1e-560a-4551-a218-91d0e04760a4" containerName="dnsmasq-dns" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.074626 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="90402c1e-560a-4551-a218-91d0e04760a4" containerName="dnsmasq-dns" Feb 25 11:12:53 crc kubenswrapper[4725]: E0225 11:12:53.074696 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90402c1e-560a-4551-a218-91d0e04760a4" containerName="init" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.074752 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="90402c1e-560a-4551-a218-91d0e04760a4" containerName="init" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.075003 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="90402c1e-560a-4551-a218-91d0e04760a4" containerName="dnsmasq-dns" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.076344 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.083241 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.083422 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.083951 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tq92j" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.102472 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90402c1e-560a-4551-a218-91d0e04760a4" (UID: "90402c1e-560a-4551-a218-91d0e04760a4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.104729 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-config-data-custom\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.104872 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-config-data\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.104914 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-689bk\" (UniqueName: \"kubernetes.io/projected/09976716-81ab-4d43-8250-fe3812bc8029-kube-api-access-689bk\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.104931 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-combined-ca-bundle\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.104962 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09976716-81ab-4d43-8250-fe3812bc8029-logs\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.110737 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.127084 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.111212 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.127500 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.127510 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.139254 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.146485 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90402c1e-560a-4551-a218-91d0e04760a4" (UID: "90402c1e-560a-4551-a218-91d0e04760a4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.147669 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6df8d5688f-fkmbb"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.162169 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.195690 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90402c1e-560a-4551-a218-91d0e04760a4" (UID: "90402c1e-560a-4551-a218-91d0e04760a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.198061 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-69c7668f4d-s7tf6"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.202610 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.249812 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-config-data-custom\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.249895 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-config-data\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.249947 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-689bk\" (UniqueName: \"kubernetes.io/projected/09976716-81ab-4d43-8250-fe3812bc8029-kube-api-access-689bk\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.249971 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-combined-ca-bundle\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250003 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-config-data\") pod 
\"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250035 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09976716-81ab-4d43-8250-fe3812bc8029-logs\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250058 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-combined-ca-bundle\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250092 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-config-data-custom\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250111 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dx4k\" (UniqueName: \"kubernetes.io/projected/b77182d3-74cf-4a61-a3a1-81efff62da8d-kube-api-access-2dx4k\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250146 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b77182d3-74cf-4a61-a3a1-81efff62da8d-logs\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250188 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250198 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90402c1e-560a-4551-a218-91d0e04760a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.250810 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09976716-81ab-4d43-8250-fe3812bc8029-logs\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.265806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-config-data\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.274400 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-combined-ca-bundle\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.285407 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09976716-81ab-4d43-8250-fe3812bc8029-config-data-custom\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.303219 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-689bk\" (UniqueName: \"kubernetes.io/projected/09976716-81ab-4d43-8250-fe3812bc8029-kube-api-access-689bk\") pod \"barbican-worker-6df8d5688f-fkmbb\" (UID: \"09976716-81ab-4d43-8250-fe3812bc8029\") " pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.351208 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-internal-tls-certs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.357161 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-scripts\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.357289 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwqdf\" (UniqueName: \"kubernetes.io/projected/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-kube-api-access-hwqdf\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 
11:12:53.357368 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-combined-ca-bundle\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.357484 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-config-data\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.357594 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-public-tls-certs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.357676 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dx4k\" (UniqueName: \"kubernetes.io/projected/b77182d3-74cf-4a61-a3a1-81efff62da8d-kube-api-access-2dx4k\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.357795 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-logs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 
25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.357948 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b77182d3-74cf-4a61-a3a1-81efff62da8d-logs\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.358080 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-config-data-custom\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.358281 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-combined-ca-bundle\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.358404 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-config-data\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.356030 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69c7668f4d-s7tf6"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.359267 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b77182d3-74cf-4a61-a3a1-81efff62da8d-logs\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.377103 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-config-data\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.385386 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dx4k\" (UniqueName: \"kubernetes.io/projected/b77182d3-74cf-4a61-a3a1-81efff62da8d-kube-api-access-2dx4k\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.403466 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-combined-ca-bundle\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.409703 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b77182d3-74cf-4a61-a3a1-81efff62da8d-config-data-custom\") pod \"barbican-keystone-listener-5b8b9cdb6b-d9zj4\" (UID: \"b77182d3-74cf-4a61-a3a1-81efff62da8d\") " pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.448308 4725 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-85ff748b95-lvpd8"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.449180 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6df8d5688f-fkmbb" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.451713 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.461263 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-config-data\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.461307 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-public-tls-certs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.461349 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-logs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.461411 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-combined-ca-bundle\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 
11:12:53.461442 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-internal-tls-certs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.461460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-scripts\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.461486 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwqdf\" (UniqueName: \"kubernetes.io/projected/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-kube-api-access-hwqdf\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.467716 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-config-data\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.472386 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-logs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.472906 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-combined-ca-bundle\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.475718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-scripts\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.477938 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-public-tls-certs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.484362 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwqdf\" (UniqueName: \"kubernetes.io/projected/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-kube-api-access-hwqdf\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.484435 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-lvpd8"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.489453 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/502da0ce-a7f4-4af1-87a8-f9a7bb197b39-internal-tls-certs\") pod \"placement-69c7668f4d-s7tf6\" (UID: \"502da0ce-a7f4-4af1-87a8-f9a7bb197b39\") " pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.494120 4725 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.501912 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-djnkv"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.508998 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-djnkv"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.525005 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57988f9b54-kk5lw"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.526354 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.535907 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57988f9b54-kk5lw"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.548252 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.564040 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-config\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.564126 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c25p4\" (UniqueName: \"kubernetes.io/projected/0f610676-8c4b-4152-bfe0-5d1ccf467671-kube-api-access-c25p4\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.564209 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.564243 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.564299 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-svc\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.564327 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.589478 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.666553 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-svc\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667115 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667197 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d640b6-86ab-4476-b233-dc7a95f5076c-logs\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667266 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data-custom\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667309 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") 
" pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667340 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-config\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667368 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-combined-ca-bundle\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667417 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwwns\" (UniqueName: \"kubernetes.io/projected/73d640b6-86ab-4476-b233-dc7a95f5076c-kube-api-access-cwwns\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667450 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c25p4\" (UniqueName: \"kubernetes.io/projected/0f610676-8c4b-4152-bfe0-5d1ccf467671-kube-api-access-c25p4\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667515 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: 
\"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.667539 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.668397 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.669105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-svc\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.669647 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.670339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-config\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 
crc kubenswrapper[4725]: I0225 11:12:53.671226 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.696670 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c25p4\" (UniqueName: \"kubernetes.io/projected/0f610676-8c4b-4152-bfe0-5d1ccf467671-kube-api-access-c25p4\") pod \"dnsmasq-dns-85ff748b95-lvpd8\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.773651 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d640b6-86ab-4476-b233-dc7a95f5076c-logs\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.774141 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data-custom\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.774272 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.774329 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-combined-ca-bundle\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.774354 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwwns\" (UniqueName: \"kubernetes.io/projected/73d640b6-86ab-4476-b233-dc7a95f5076c-kube-api-access-cwwns\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.774538 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d640b6-86ab-4476-b233-dc7a95f5076c-logs\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.783447 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-combined-ca-bundle\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.788528 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.789231 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data-custom\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.807157 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwwns\" (UniqueName: \"kubernetes.io/projected/73d640b6-86ab-4476-b233-dc7a95f5076c-kube-api-access-cwwns\") pod \"barbican-api-57988f9b54-kk5lw\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.897006 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.900522 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7dcb568bf7-chvcs"] Feb 25 11:12:53 crc kubenswrapper[4725]: I0225 11:12:53.930740 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.011859 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7dcb568bf7-chvcs" event={"ID":"8145d393-0967-4acc-bd07-befcc3252202","Type":"ContainerStarted","Data":"5d53dc352ba7aaabbc0f67300f2cac8b654b76f4ea11efef8265f759ceda417a"} Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.231814 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4"] Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.236569 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6df8d5688f-fkmbb"] Feb 25 11:12:54 crc kubenswrapper[4725]: W0225 11:12:54.243986 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb77182d3_74cf_4a61_a3a1_81efff62da8d.slice/crio-c2b82e37487395c1d869ada2dffee895c155233203e05ef8b663829c5f138125 WatchSource:0}: Error finding container c2b82e37487395c1d869ada2dffee895c155233203e05ef8b663829c5f138125: Status 404 returned error can't find the container with id c2b82e37487395c1d869ada2dffee895c155233203e05ef8b663829c5f138125 Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.399725 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69c7668f4d-s7tf6"] Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.418259 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.418358 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.542119 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-lvpd8"] Feb 25 11:12:54 crc kubenswrapper[4725]: I0225 11:12:54.652475 4725 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57988f9b54-kk5lw"] Feb 25 11:12:54 crc kubenswrapper[4725]: W0225 11:12:54.673001 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73d640b6_86ab_4476_b233_dc7a95f5076c.slice/crio-3a061ec8faa029ffe01e0dff15ae0233e578f35a3694b99e886a1b944825768e WatchSource:0}: Error finding container 3a061ec8faa029ffe01e0dff15ae0233e578f35a3694b99e886a1b944825768e: Status 404 returned error can't find the container with id 3a061ec8faa029ffe01e0dff15ae0233e578f35a3694b99e886a1b944825768e Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.039125 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7dcb568bf7-chvcs" event={"ID":"8145d393-0967-4acc-bd07-befcc3252202","Type":"ContainerStarted","Data":"99d14d40131e51c29c4283e490fb62d1e23e2b80e41e21c6dee46402e9855854"} Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.041728 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.043077 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6df8d5688f-fkmbb" event={"ID":"09976716-81ab-4d43-8250-fe3812bc8029","Type":"ContainerStarted","Data":"1d641b44de51c9f019cea7fb188cc773ac55ba1c18fe49dbded4db3e35612a2c"} Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.044940 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57988f9b54-kk5lw" event={"ID":"73d640b6-86ab-4476-b233-dc7a95f5076c","Type":"ContainerStarted","Data":"3a061ec8faa029ffe01e0dff15ae0233e578f35a3694b99e886a1b944825768e"} Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.046890 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" 
event={"ID":"b77182d3-74cf-4a61-a3a1-81efff62da8d","Type":"ContainerStarted","Data":"c2b82e37487395c1d869ada2dffee895c155233203e05ef8b663829c5f138125"} Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.068378 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69c7668f4d-s7tf6" event={"ID":"502da0ce-a7f4-4af1-87a8-f9a7bb197b39","Type":"ContainerStarted","Data":"ca0246a522a7ae0209980791c7873e661c5be6a52469d5fca02a309a1f334651"} Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.068419 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69c7668f4d-s7tf6" event={"ID":"502da0ce-a7f4-4af1-87a8-f9a7bb197b39","Type":"ContainerStarted","Data":"55dafd87de3959c99133aecd0e7c3d3f5a2b5c9d549181e8af4261a396316241"} Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.070790 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7dcb568bf7-chvcs" podStartSLOduration=3.07077898 podStartE2EDuration="3.07077898s" podCreationTimestamp="2026-02-25 11:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:55.055739389 +0000 UTC m=+1200.554321414" watchObservedRunningTime="2026-02-25 11:12:55.07077898 +0000 UTC m=+1200.569360995" Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.073728 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.073745 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.074855 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" event={"ID":"0f610676-8c4b-4152-bfe0-5d1ccf467671","Type":"ContainerStarted","Data":"809fefd63171c7e78f71d003acf2091f25ebd279492530a290f60b38b24be93f"} Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 
11:12:55.358117 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90402c1e-560a-4551-a218-91d0e04760a4" path="/var/lib/kubelet/pods/90402c1e-560a-4551-a218-91d0e04760a4/volumes" Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.362853 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 11:12:55 crc kubenswrapper[4725]: I0225 11:12:55.777671 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.099389 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7mk8j" event={"ID":"afe5daf6-23bb-4480-8bd7-724dbb47ad3d","Type":"ContainerStarted","Data":"ce68badea7af2996cd0c81c8799313c0b2dc12722be1709a23bbb44d9a9f890c"} Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.119595 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7mk8j" podStartSLOduration=4.332395406 podStartE2EDuration="44.119575252s" podCreationTimestamp="2026-02-25 11:12:12 +0000 UTC" firstStartedPulling="2026-02-25 11:12:15.029395624 +0000 UTC m=+1160.527977639" lastFinishedPulling="2026-02-25 11:12:54.81657546 +0000 UTC m=+1200.315157485" observedRunningTime="2026-02-25 11:12:56.118193505 +0000 UTC m=+1201.616775530" watchObservedRunningTime="2026-02-25 11:12:56.119575252 +0000 UTC m=+1201.618157297" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.127506 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69c7668f4d-s7tf6" event={"ID":"502da0ce-a7f4-4af1-87a8-f9a7bb197b39","Type":"ContainerStarted","Data":"b6f8517783901cb5740f8c066df33cd14b4c7f3b1190097c157f6008d9cb62ca"} Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.128779 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:56 
crc kubenswrapper[4725]: I0225 11:12:56.128809 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.130863 4725 generic.go:334] "Generic (PLEG): container finished" podID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerID="661e3bb5faa3087c218ba03b3a0d35a4416f1e5cea7198249aedd5df77bf7178" exitCode=0 Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.130920 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" event={"ID":"0f610676-8c4b-4152-bfe0-5d1ccf467671","Type":"ContainerDied","Data":"661e3bb5faa3087c218ba03b3a0d35a4416f1e5cea7198249aedd5df77bf7178"} Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.156335 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-69c7668f4d-s7tf6" podStartSLOduration=3.1563178020000002 podStartE2EDuration="3.156317802s" podCreationTimestamp="2026-02-25 11:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:56.148407801 +0000 UTC m=+1201.646989856" watchObservedRunningTime="2026-02-25 11:12:56.156317802 +0000 UTC m=+1201.654899817" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.159149 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.160630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57988f9b54-kk5lw" event={"ID":"73d640b6-86ab-4476-b233-dc7a95f5076c","Type":"ContainerStarted","Data":"ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340"} Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.160667 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 
11:12:56.160681 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57988f9b54-kk5lw" event={"ID":"73d640b6-86ab-4476-b233-dc7a95f5076c","Type":"ContainerStarted","Data":"6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9"} Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.161246 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.203645 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84dc96ccc8-zhwrq"] Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.205588 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.206526 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57988f9b54-kk5lw" podStartSLOduration=3.206511161 podStartE2EDuration="3.206511161s" podCreationTimestamp="2026-02-25 11:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:56.192963989 +0000 UTC m=+1201.691546034" watchObservedRunningTime="2026-02-25 11:12:56.206511161 +0000 UTC m=+1201.705093186" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.210310 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.210540 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.232220 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84dc96ccc8-zhwrq"] Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.322591 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-config-data-custom\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.322713 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-logs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.322734 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-public-tls-certs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.322750 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqppg\" (UniqueName: \"kubernetes.io/projected/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-kube-api-access-vqppg\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.322844 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-internal-tls-certs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc 
kubenswrapper[4725]: I0225 11:12:56.322910 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-config-data\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.323001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-combined-ca-bundle\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.425926 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-combined-ca-bundle\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.426017 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-config-data-custom\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.426059 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-logs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: 
I0225 11:12:56.426081 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-public-tls-certs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.426099 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqppg\" (UniqueName: \"kubernetes.io/projected/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-kube-api-access-vqppg\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.426161 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-internal-tls-certs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.426209 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-config-data\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.427091 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-logs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.432812 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-config-data-custom\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.434090 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-public-tls-certs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.434309 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-internal-tls-certs\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.434862 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-combined-ca-bundle\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.437272 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-config-data\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.442063 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqppg\" 
(UniqueName: \"kubernetes.io/projected/b4ab7d45-3a36-4ffc-9004-62ff70fbfe53-kube-api-access-vqppg\") pod \"barbican-api-84dc96ccc8-zhwrq\" (UID: \"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53\") " pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.532753 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:12:56 crc kubenswrapper[4725]: I0225 11:12:56.641912 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 11:12:57 crc kubenswrapper[4725]: I0225 11:12:57.961989 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84dc96ccc8-zhwrq"] Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.181772 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6df8d5688f-fkmbb" event={"ID":"09976716-81ab-4d43-8250-fe3812bc8029","Type":"ContainerStarted","Data":"172884b79c690673db12382df9b473e378e93864c5ddb907a82138cd9991d8d9"} Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.182146 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6df8d5688f-fkmbb" event={"ID":"09976716-81ab-4d43-8250-fe3812bc8029","Type":"ContainerStarted","Data":"29cb84e6ec01f78bd2ad7707e57e9870a9c39fe3a6c4b73da206572ed88ede24"} Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.186618 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" event={"ID":"b77182d3-74cf-4a61-a3a1-81efff62da8d","Type":"ContainerStarted","Data":"1c53a7f1f51c5d9e0505b6573328b70ea59213f5c09c1cfdc0d768978f8a1d41"} Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.186675 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" 
event={"ID":"b77182d3-74cf-4a61-a3a1-81efff62da8d","Type":"ContainerStarted","Data":"6edbc2cdd59be851385c9af955115575453ebe94dfebdc0b5d3df94ac441882f"} Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.188128 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84dc96ccc8-zhwrq" event={"ID":"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53","Type":"ContainerStarted","Data":"3905f8eda0d4991acb8a150e657106115f3528fc5ee0fa6856ff500ec4c84edb"} Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.188155 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84dc96ccc8-zhwrq" event={"ID":"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53","Type":"ContainerStarted","Data":"3b21501bf6fa3bff9ef11616853bdc100a4d4575d0e0e86833a8821cae7c1a56"} Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.193956 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" event={"ID":"0f610676-8c4b-4152-bfe0-5d1ccf467671","Type":"ContainerStarted","Data":"7eaeb2c06bc7d872db3845efe5d7ead5820d5d640b75ec0c0163e14ba2294dfc"} Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.218228 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6df8d5688f-fkmbb" podStartSLOduration=1.9723227639999998 podStartE2EDuration="5.218208013s" podCreationTimestamp="2026-02-25 11:12:53 +0000 UTC" firstStartedPulling="2026-02-25 11:12:54.255345932 +0000 UTC m=+1199.753927947" lastFinishedPulling="2026-02-25 11:12:57.501231171 +0000 UTC m=+1202.999813196" observedRunningTime="2026-02-25 11:12:58.209213033 +0000 UTC m=+1203.707795058" watchObservedRunningTime="2026-02-25 11:12:58.218208013 +0000 UTC m=+1203.716790038" Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.244066 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" podStartSLOduration=5.244045212 podStartE2EDuration="5.244045212s" 
podCreationTimestamp="2026-02-25 11:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:58.234107237 +0000 UTC m=+1203.732689262" watchObservedRunningTime="2026-02-25 11:12:58.244045212 +0000 UTC m=+1203.742627247" Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.264681 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b8b9cdb6b-d9zj4" podStartSLOduration=2.039571418 podStartE2EDuration="5.264663382s" podCreationTimestamp="2026-02-25 11:12:53 +0000 UTC" firstStartedPulling="2026-02-25 11:12:54.245929021 +0000 UTC m=+1199.744511046" lastFinishedPulling="2026-02-25 11:12:57.471020985 +0000 UTC m=+1202.969603010" observedRunningTime="2026-02-25 11:12:58.260905982 +0000 UTC m=+1203.759488017" watchObservedRunningTime="2026-02-25 11:12:58.264663382 +0000 UTC m=+1203.763245407" Feb 25 11:12:58 crc kubenswrapper[4725]: I0225 11:12:58.897758 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:12:59 crc kubenswrapper[4725]: I0225 11:12:59.203384 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84dc96ccc8-zhwrq" event={"ID":"b4ab7d45-3a36-4ffc-9004-62ff70fbfe53","Type":"ContainerStarted","Data":"845e6e2cdd0e76aa0811eb584e8514a9b1951249d93deb728553c2c0d80388b0"} Feb 25 11:12:59 crc kubenswrapper[4725]: I0225 11:12:59.238711 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84dc96ccc8-zhwrq" podStartSLOduration=3.23869553 podStartE2EDuration="3.23869553s" podCreationTimestamp="2026-02-25 11:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:12:59.223395392 +0000 UTC m=+1204.721977437" watchObservedRunningTime="2026-02-25 11:12:59.23869553 
+0000 UTC m=+1204.737277555"
Feb 25 11:13:00 crc kubenswrapper[4725]: I0225 11:13:00.211302 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84dc96ccc8-zhwrq"
Feb 25 11:13:00 crc kubenswrapper[4725]: I0225 11:13:00.211345 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84dc96ccc8-zhwrq"
Feb 25 11:13:01 crc kubenswrapper[4725]: I0225 11:13:01.219366 4725 generic.go:334] "Generic (PLEG): container finished" podID="afe5daf6-23bb-4480-8bd7-724dbb47ad3d" containerID="ce68badea7af2996cd0c81c8799313c0b2dc12722be1709a23bbb44d9a9f890c" exitCode=0
Feb 25 11:13:01 crc kubenswrapper[4725]: I0225 11:13:01.220599 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7mk8j" event={"ID":"afe5daf6-23bb-4480-8bd7-724dbb47ad3d","Type":"ContainerDied","Data":"ce68badea7af2996cd0c81c8799313c0b2dc12722be1709a23bbb44d9a9f890c"}
Feb 25 11:13:01 crc kubenswrapper[4725]: I0225 11:13:01.320898 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64cd88bfbd-zxddf" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Feb 25 11:13:01 crc kubenswrapper[4725]: I0225 11:13:01.407484 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7cbf649584-gsrdx" podUID="f017ec2d-5d1b-405c-b2f7-b3212e3696d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused"
Feb 25 11:13:03 crc kubenswrapper[4725]: I0225 11:13:03.899551 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8"
Feb 25 11:13:04 crc kubenswrapper[4725]: I0225 11:13:04.011315 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-66qfw"]
Feb 25 11:13:04 crc kubenswrapper[4725]: I0225 11:13:04.011561 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" podUID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerName="dnsmasq-dns" containerID="cri-o://e1a6af681df5efd1e55b7fa96a69717eedd2256d3b18d0929498e03b4942b96f" gracePeriod=10
Feb 25 11:13:04 crc kubenswrapper[4725]: I0225 11:13:04.265301 4725 generic.go:334] "Generic (PLEG): container finished" podID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerID="e1a6af681df5efd1e55b7fa96a69717eedd2256d3b18d0929498e03b4942b96f" exitCode=0
Feb 25 11:13:04 crc kubenswrapper[4725]: I0225 11:13:04.265359 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" event={"ID":"8bcf915d-87e3-4faf-8875-adeb9f0146af","Type":"ContainerDied","Data":"e1a6af681df5efd1e55b7fa96a69717eedd2256d3b18d0929498e03b4942b96f"}
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.542798 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57988f9b54-kk5lw"
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.732045 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57988f9b54-kk5lw"
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.871003 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.941002 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-db-sync-config-data\") pod \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") "
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.941224 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-config-data\") pod \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") "
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.941346 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-etc-machine-id\") pod \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") "
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.941448 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-combined-ca-bundle\") pod \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") "
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.941562 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ppsg\" (UniqueName: \"kubernetes.io/projected/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-kube-api-access-2ppsg\") pod \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") "
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.941662 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-scripts\") pod \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\" (UID: \"afe5daf6-23bb-4480-8bd7-724dbb47ad3d\") "
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.941876 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "afe5daf6-23bb-4480-8bd7-724dbb47ad3d" (UID: "afe5daf6-23bb-4480-8bd7-724dbb47ad3d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.942942 4725 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.953266 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "afe5daf6-23bb-4480-8bd7-724dbb47ad3d" (UID: "afe5daf6-23bb-4480-8bd7-724dbb47ad3d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.953687 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-scripts" (OuterVolumeSpecName: "scripts") pod "afe5daf6-23bb-4480-8bd7-724dbb47ad3d" (UID: "afe5daf6-23bb-4480-8bd7-724dbb47ad3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:05 crc kubenswrapper[4725]: I0225 11:13:05.963663 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-kube-api-access-2ppsg" (OuterVolumeSpecName: "kube-api-access-2ppsg") pod "afe5daf6-23bb-4480-8bd7-724dbb47ad3d" (UID: "afe5daf6-23bb-4480-8bd7-724dbb47ad3d"). InnerVolumeSpecName "kube-api-access-2ppsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.022540 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afe5daf6-23bb-4480-8bd7-724dbb47ad3d" (UID: "afe5daf6-23bb-4480-8bd7-724dbb47ad3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.048791 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ppsg\" (UniqueName: \"kubernetes.io/projected/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-kube-api-access-2ppsg\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.048820 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.048848 4725 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.048858 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.068204 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-config-data" (OuterVolumeSpecName: "config-data") pod "afe5daf6-23bb-4480-8bd7-724dbb47ad3d" (UID: "afe5daf6-23bb-4480-8bd7-724dbb47ad3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.150991 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe5daf6-23bb-4480-8bd7-724dbb47ad3d-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: E0225 11:13:06.189942 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7492d83b-6fd0-420c-99a5-19caedc41981"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.199848 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.287746 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerStarted","Data":"f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814"}
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.287971 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="ceilometer-notification-agent" containerID="cri-o://e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c" gracePeriod=30
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.288270 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.288541 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="proxy-httpd" containerID="cri-o://f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814" gracePeriod=30
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.288597 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="sg-core" containerID="cri-o://c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef" gracePeriod=30
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.297688 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7mk8j"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.298191 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7mk8j" event={"ID":"afe5daf6-23bb-4480-8bd7-724dbb47ad3d","Type":"ContainerDied","Data":"9c6014acdab9674f9313d281a65976fa14b19991eedc63917ece3d5dab9d691a"}
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.298228 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c6014acdab9674f9313d281a65976fa14b19991eedc63917ece3d5dab9d691a"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.318483 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-66qfw"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.318937 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-66qfw" event={"ID":"8bcf915d-87e3-4faf-8875-adeb9f0146af","Type":"ContainerDied","Data":"ca34d6f1705fb62a862645e2cf68ed8dd88ff6b9bafe56400faabd4b2b197e79"}
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.318975 4725 scope.go:117] "RemoveContainer" containerID="e1a6af681df5efd1e55b7fa96a69717eedd2256d3b18d0929498e03b4942b96f"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.336229 4725 scope.go:117] "RemoveContainer" containerID="341f2d56520a68078117358a0ff222d8b5adc331235c5d23d73c92f0e6a1f98e"
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.353735 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-config\") pod \"8bcf915d-87e3-4faf-8875-adeb9f0146af\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") "
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.353837 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-sb\") pod \"8bcf915d-87e3-4faf-8875-adeb9f0146af\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") "
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.353866 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kdvg\" (UniqueName: \"kubernetes.io/projected/8bcf915d-87e3-4faf-8875-adeb9f0146af-kube-api-access-7kdvg\") pod \"8bcf915d-87e3-4faf-8875-adeb9f0146af\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") "
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.353958 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-nb\") pod \"8bcf915d-87e3-4faf-8875-adeb9f0146af\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") "
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.354024 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-swift-storage-0\") pod \"8bcf915d-87e3-4faf-8875-adeb9f0146af\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") "
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.354110 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-svc\") pod \"8bcf915d-87e3-4faf-8875-adeb9f0146af\" (UID: \"8bcf915d-87e3-4faf-8875-adeb9f0146af\") "
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.359491 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcf915d-87e3-4faf-8875-adeb9f0146af-kube-api-access-7kdvg" (OuterVolumeSpecName: "kube-api-access-7kdvg") pod "8bcf915d-87e3-4faf-8875-adeb9f0146af" (UID: "8bcf915d-87e3-4faf-8875-adeb9f0146af"). InnerVolumeSpecName "kube-api-access-7kdvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.415285 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bcf915d-87e3-4faf-8875-adeb9f0146af" (UID: "8bcf915d-87e3-4faf-8875-adeb9f0146af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.418726 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-config" (OuterVolumeSpecName: "config") pod "8bcf915d-87e3-4faf-8875-adeb9f0146af" (UID: "8bcf915d-87e3-4faf-8875-adeb9f0146af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.422737 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bcf915d-87e3-4faf-8875-adeb9f0146af" (UID: "8bcf915d-87e3-4faf-8875-adeb9f0146af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.431755 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bcf915d-87e3-4faf-8875-adeb9f0146af" (UID: "8bcf915d-87e3-4faf-8875-adeb9f0146af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.436070 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bcf915d-87e3-4faf-8875-adeb9f0146af" (UID: "8bcf915d-87e3-4faf-8875-adeb9f0146af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.456901 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.456926 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-config\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.456936 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.456949 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kdvg\" (UniqueName: \"kubernetes.io/projected/8bcf915d-87e3-4faf-8875-adeb9f0146af-kube-api-access-7kdvg\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.456958 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.456967 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bcf915d-87e3-4faf-8875-adeb9f0146af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.650183 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-66qfw"]
Feb 25 11:13:06 crc kubenswrapper[4725]: I0225 11:13:06.657493 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-66qfw"]
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.193427 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 25 11:13:07 crc kubenswrapper[4725]: E0225 11:13:07.193986 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe5daf6-23bb-4480-8bd7-724dbb47ad3d" containerName="cinder-db-sync"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.194003 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe5daf6-23bb-4480-8bd7-724dbb47ad3d" containerName="cinder-db-sync"
Feb 25 11:13:07 crc kubenswrapper[4725]: E0225 11:13:07.194019 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerName="init"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.194026 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerName="init"
Feb 25 11:13:07 crc kubenswrapper[4725]: E0225 11:13:07.194038 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerName="dnsmasq-dns"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.194043 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerName="dnsmasq-dns"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.194188 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bcf915d-87e3-4faf-8875-adeb9f0146af" containerName="dnsmasq-dns"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.194209 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe5daf6-23bb-4480-8bd7-724dbb47ad3d" containerName="cinder-db-sync"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.198645 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.203892 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tf7vm"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.205747 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.205929 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.209707 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.257120 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bcf915d-87e3-4faf-8875-adeb9f0146af" path="/var/lib/kubelet/pods/8bcf915d-87e3-4faf-8875-adeb9f0146af/volumes"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.257847 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.272754 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.272842 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9v5h\" (UniqueName: \"kubernetes.io/projected/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-kube-api-access-p9v5h\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.272882 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.272900 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.272948 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-4ntbw"]
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.273038 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.273115 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.274409 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.282227 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-4ntbw"]
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.360577 4725 generic.go:334] "Generic (PLEG): container finished" podID="7492d83b-6fd0-420c-99a5-19caedc41981" containerID="f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814" exitCode=0
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.360628 4725 generic.go:334] "Generic (PLEG): container finished" podID="7492d83b-6fd0-420c-99a5-19caedc41981" containerID="c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef" exitCode=2
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.360648 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerDied","Data":"f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814"}
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.360692 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerDied","Data":"c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef"}
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376272 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376345 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376416 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376670 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9v5h\" (UniqueName: \"kubernetes.io/projected/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-kube-api-access-p9v5h\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376747 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376780 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376810 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376890 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.376922 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.377011 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.377071 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.377103 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2djrx\" (UniqueName: \"kubernetes.io/projected/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-kube-api-access-2djrx\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.378843 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.393629 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.399551 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.401535 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.409515 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9v5h\" (UniqueName: \"kubernetes.io/projected/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-kube-api-access-p9v5h\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.412718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " pod="openstack/cinder-scheduler-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.445662 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.447188 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.455486 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.459708 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.490361 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.490476 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.490543 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.490600 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.490635 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.490752 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2djrx\" (UniqueName: \"kubernetes.io/projected/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-kube-api-access-2djrx\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.491380 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.491747 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.491882 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.492237 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.492725 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.515886 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2djrx\" (UniqueName: \"kubernetes.io/projected/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-kube-api-access-2djrx\") pod \"dnsmasq-dns-5c9776ccc5-4ntbw\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw"
Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.527769 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.593051 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phc4\" (UniqueName: \"kubernetes.io/projected/1f38e78f-45de-4061-8fc4-561318e984dd-kube-api-access-5phc4\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.593135 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f38e78f-45de-4061-8fc4-561318e984dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.593168 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.593227 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.593260 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-scripts\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.593294 
4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f38e78f-45de-4061-8fc4-561318e984dd-logs\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.593348 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.614281 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.698911 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.698949 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-scripts\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.699000 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f38e78f-45de-4061-8fc4-561318e984dd-logs\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.699056 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.699117 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phc4\" (UniqueName: \"kubernetes.io/projected/1f38e78f-45de-4061-8fc4-561318e984dd-kube-api-access-5phc4\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.699155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f38e78f-45de-4061-8fc4-561318e984dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.699173 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.708290 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f38e78f-45de-4061-8fc4-561318e984dd-logs\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.715313 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f38e78f-45de-4061-8fc4-561318e984dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " 
pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.723605 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.737573 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-scripts\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.738672 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.767542 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.790322 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phc4\" (UniqueName: \"kubernetes.io/projected/1f38e78f-45de-4061-8fc4-561318e984dd-kube-api-access-5phc4\") pod \"cinder-api-0\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.815329 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 11:13:07 crc kubenswrapper[4725]: I0225 11:13:07.952570 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.269139 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-4ntbw"] Feb 25 11:13:08 crc kubenswrapper[4725]: W0225 11:13:08.276736 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc667f48_ac1c_4c0a_8e15_25c7adb7e6a5.slice/crio-e211ae86272138c295603fc9beaf5a588b4b4ede7f7335302dd2b2169c2852c4 WatchSource:0}: Error finding container e211ae86272138c295603fc9beaf5a588b4b4ede7f7335302dd2b2169c2852c4: Status 404 returned error can't find the container with id e211ae86272138c295603fc9beaf5a588b4b4ede7f7335302dd2b2169c2852c4 Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.315958 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.370059 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" event={"ID":"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5","Type":"ContainerStarted","Data":"e211ae86272138c295603fc9beaf5a588b4b4ede7f7335302dd2b2169c2852c4"} Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.371381 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33","Type":"ContainerStarted","Data":"96c02d6eb935ddb3fb6da97807e766556c812c57d362a6fb5272abfb8ea7a70f"} Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.430365 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 11:13:08 crc kubenswrapper[4725]: W0225 11:13:08.440522 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f38e78f_45de_4061_8fc4_561318e984dd.slice/crio-be1a8cd8173cc8094f3f71abe141936ad3757ffbcab2ce5fc4b631eae3065fa9 WatchSource:0}: Error finding container be1a8cd8173cc8094f3f71abe141936ad3757ffbcab2ce5fc4b631eae3065fa9: Status 404 returned error can't find the container with id be1a8cd8173cc8094f3f71abe141936ad3757ffbcab2ce5fc4b631eae3065fa9 Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.588015 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84dc96ccc8-zhwrq" Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.656923 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57988f9b54-kk5lw"] Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.657130 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57988f9b54-kk5lw" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api-log" containerID="cri-o://ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340" gracePeriod=30 Feb 25 11:13:08 crc kubenswrapper[4725]: I0225 11:13:08.657495 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57988f9b54-kk5lw" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api" containerID="cri-o://6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9" gracePeriod=30 Feb 25 11:13:09 crc kubenswrapper[4725]: I0225 11:13:09.354449 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 25 11:13:09 crc kubenswrapper[4725]: I0225 11:13:09.396437 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" event={"ID":"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5","Type":"ContainerDied","Data":"cacef5bdb7fa98cd4e0c9cc88e4fcda5f425a2637bf2280a79a21f09a0af3323"} Feb 25 11:13:09 crc kubenswrapper[4725]: I0225 
11:13:09.397535 4725 generic.go:334] "Generic (PLEG): container finished" podID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerID="cacef5bdb7fa98cd4e0c9cc88e4fcda5f425a2637bf2280a79a21f09a0af3323" exitCode=0 Feb 25 11:13:09 crc kubenswrapper[4725]: I0225 11:13:09.411420 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f38e78f-45de-4061-8fc4-561318e984dd","Type":"ContainerStarted","Data":"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f"} Feb 25 11:13:09 crc kubenswrapper[4725]: I0225 11:13:09.411465 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f38e78f-45de-4061-8fc4-561318e984dd","Type":"ContainerStarted","Data":"be1a8cd8173cc8094f3f71abe141936ad3757ffbcab2ce5fc4b631eae3065fa9"} Feb 25 11:13:09 crc kubenswrapper[4725]: I0225 11:13:09.420449 4725 generic.go:334] "Generic (PLEG): container finished" podID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerID="ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340" exitCode=143 Feb 25 11:13:09 crc kubenswrapper[4725]: I0225 11:13:09.420493 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57988f9b54-kk5lw" event={"ID":"73d640b6-86ab-4476-b233-dc7a95f5076c","Type":"ContainerDied","Data":"ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340"} Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.431733 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33","Type":"ContainerStarted","Data":"2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222"} Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.432093 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33","Type":"ContainerStarted","Data":"08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871"} Feb 25 
11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.436350 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f38e78f-45de-4061-8fc4-561318e984dd","Type":"ContainerStarted","Data":"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c"} Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.436513 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api-log" containerID="cri-o://4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f" gracePeriod=30 Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.436552 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.436584 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api" containerID="cri-o://6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c" gracePeriod=30 Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.444789 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" event={"ID":"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5","Type":"ContainerStarted","Data":"38e21351c8c08b2e6efe54354e4ca9e6b31f36c834840273944f7410012695bb"} Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.444967 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.463870 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.6838972180000003 podStartE2EDuration="3.463852189s" podCreationTimestamp="2026-02-25 11:13:07 +0000 UTC" firstStartedPulling="2026-02-25 11:13:07.962181748 +0000 UTC 
m=+1213.460763783" lastFinishedPulling="2026-02-25 11:13:08.742136729 +0000 UTC m=+1214.240718754" observedRunningTime="2026-02-25 11:13:10.456858132 +0000 UTC m=+1215.955440157" watchObservedRunningTime="2026-02-25 11:13:10.463852189 +0000 UTC m=+1215.962434244" Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.480870 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.480845662 podStartE2EDuration="3.480845662s" podCreationTimestamp="2026-02-25 11:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:10.478981022 +0000 UTC m=+1215.977563067" watchObservedRunningTime="2026-02-25 11:13:10.480845662 +0000 UTC m=+1215.979427687" Feb 25 11:13:10 crc kubenswrapper[4725]: I0225 11:13:10.505053 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" podStartSLOduration=3.505036007 podStartE2EDuration="3.505036007s" podCreationTimestamp="2026-02-25 11:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:10.500903457 +0000 UTC m=+1215.999485492" watchObservedRunningTime="2026-02-25 11:13:10.505036007 +0000 UTC m=+1216.003618042" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.167215 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.297496 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data-custom\") pod \"1f38e78f-45de-4061-8fc4-561318e984dd\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.297897 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f38e78f-45de-4061-8fc4-561318e984dd-etc-machine-id\") pod \"1f38e78f-45de-4061-8fc4-561318e984dd\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.297976 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data\") pod \"1f38e78f-45de-4061-8fc4-561318e984dd\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.298028 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-scripts\") pod \"1f38e78f-45de-4061-8fc4-561318e984dd\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.298041 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f38e78f-45de-4061-8fc4-561318e984dd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1f38e78f-45de-4061-8fc4-561318e984dd" (UID: "1f38e78f-45de-4061-8fc4-561318e984dd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.298089 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f38e78f-45de-4061-8fc4-561318e984dd-logs\") pod \"1f38e78f-45de-4061-8fc4-561318e984dd\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.298146 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5phc4\" (UniqueName: \"kubernetes.io/projected/1f38e78f-45de-4061-8fc4-561318e984dd-kube-api-access-5phc4\") pod \"1f38e78f-45de-4061-8fc4-561318e984dd\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.298238 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-combined-ca-bundle\") pod \"1f38e78f-45de-4061-8fc4-561318e984dd\" (UID: \"1f38e78f-45de-4061-8fc4-561318e984dd\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.298677 4725 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f38e78f-45de-4061-8fc4-561318e984dd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.298757 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f38e78f-45de-4061-8fc4-561318e984dd-logs" (OuterVolumeSpecName: "logs") pod "1f38e78f-45de-4061-8fc4-561318e984dd" (UID: "1f38e78f-45de-4061-8fc4-561318e984dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.301371 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.303211 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f38e78f-45de-4061-8fc4-561318e984dd" (UID: "1f38e78f-45de-4061-8fc4-561318e984dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.303316 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f38e78f-45de-4061-8fc4-561318e984dd-kube-api-access-5phc4" (OuterVolumeSpecName: "kube-api-access-5phc4") pod "1f38e78f-45de-4061-8fc4-561318e984dd" (UID: "1f38e78f-45de-4061-8fc4-561318e984dd"). InnerVolumeSpecName "kube-api-access-5phc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.304714 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-scripts" (OuterVolumeSpecName: "scripts") pod "1f38e78f-45de-4061-8fc4-561318e984dd" (UID: "1f38e78f-45de-4061-8fc4-561318e984dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.329986 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f38e78f-45de-4061-8fc4-561318e984dd" (UID: "1f38e78f-45de-4061-8fc4-561318e984dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.366288 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data" (OuterVolumeSpecName: "config-data") pod "1f38e78f-45de-4061-8fc4-561318e984dd" (UID: "1f38e78f-45de-4061-8fc4-561318e984dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.399664 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mccc5\" (UniqueName: \"kubernetes.io/projected/7492d83b-6fd0-420c-99a5-19caedc41981-kube-api-access-mccc5\") pod \"7492d83b-6fd0-420c-99a5-19caedc41981\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.399731 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-sg-core-conf-yaml\") pod \"7492d83b-6fd0-420c-99a5-19caedc41981\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.399795 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-run-httpd\") pod \"7492d83b-6fd0-420c-99a5-19caedc41981\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.399857 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-combined-ca-bundle\") pod \"7492d83b-6fd0-420c-99a5-19caedc41981\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.399990 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-log-httpd\") pod \"7492d83b-6fd0-420c-99a5-19caedc41981\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400014 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-scripts\") pod \"7492d83b-6fd0-420c-99a5-19caedc41981\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400041 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-config-data\") pod \"7492d83b-6fd0-420c-99a5-19caedc41981\" (UID: \"7492d83b-6fd0-420c-99a5-19caedc41981\") " Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400255 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7492d83b-6fd0-420c-99a5-19caedc41981" (UID: "7492d83b-6fd0-420c-99a5-19caedc41981"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400474 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7492d83b-6fd0-420c-99a5-19caedc41981" (UID: "7492d83b-6fd0-420c-99a5-19caedc41981"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400563 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f38e78f-45de-4061-8fc4-561318e984dd-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400583 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400596 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5phc4\" (UniqueName: \"kubernetes.io/projected/1f38e78f-45de-4061-8fc4-561318e984dd-kube-api-access-5phc4\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400608 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400619 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7492d83b-6fd0-420c-99a5-19caedc41981-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400629 4725 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400639 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.400649 4725 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f38e78f-45de-4061-8fc4-561318e984dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.406974 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-scripts" (OuterVolumeSpecName: "scripts") pod "7492d83b-6fd0-420c-99a5-19caedc41981" (UID: "7492d83b-6fd0-420c-99a5-19caedc41981"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.407114 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7492d83b-6fd0-420c-99a5-19caedc41981-kube-api-access-mccc5" (OuterVolumeSpecName: "kube-api-access-mccc5") pod "7492d83b-6fd0-420c-99a5-19caedc41981" (UID: "7492d83b-6fd0-420c-99a5-19caedc41981"). InnerVolumeSpecName "kube-api-access-mccc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.430184 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7492d83b-6fd0-420c-99a5-19caedc41981" (UID: "7492d83b-6fd0-420c-99a5-19caedc41981"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.458072 4725 generic.go:334] "Generic (PLEG): container finished" podID="7492d83b-6fd0-420c-99a5-19caedc41981" containerID="e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c" exitCode=0 Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.458133 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerDied","Data":"e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c"} Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.458162 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7492d83b-6fd0-420c-99a5-19caedc41981","Type":"ContainerDied","Data":"c21c2e771659702c09d639abb64e9910250290abffeef19b19a4278339d519d0"} Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.458182 4725 scope.go:117] "RemoveContainer" containerID="f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.458323 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.473043 4725 generic.go:334] "Generic (PLEG): container finished" podID="1f38e78f-45de-4061-8fc4-561318e984dd" containerID="6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c" exitCode=0 Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.473088 4725 generic.go:334] "Generic (PLEG): container finished" podID="1f38e78f-45de-4061-8fc4-561318e984dd" containerID="4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f" exitCode=143 Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.474086 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.477379 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f38e78f-45de-4061-8fc4-561318e984dd","Type":"ContainerDied","Data":"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c"} Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.477429 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f38e78f-45de-4061-8fc4-561318e984dd","Type":"ContainerDied","Data":"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f"} Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.477442 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f38e78f-45de-4061-8fc4-561318e984dd","Type":"ContainerDied","Data":"be1a8cd8173cc8094f3f71abe141936ad3757ffbcab2ce5fc4b631eae3065fa9"} Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.491372 4725 scope.go:117] "RemoveContainer" containerID="c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.492367 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-config-data" (OuterVolumeSpecName: "config-data") pod "7492d83b-6fd0-420c-99a5-19caedc41981" (UID: "7492d83b-6fd0-420c-99a5-19caedc41981"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.500036 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7492d83b-6fd0-420c-99a5-19caedc41981" (UID: "7492d83b-6fd0-420c-99a5-19caedc41981"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.504621 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.504662 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.504677 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mccc5\" (UniqueName: \"kubernetes.io/projected/7492d83b-6fd0-420c-99a5-19caedc41981-kube-api-access-mccc5\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.504691 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.504703 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7492d83b-6fd0-420c-99a5-19caedc41981-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.523522 4725 scope.go:117] "RemoveContainer" containerID="e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.526911 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.537905 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.556634 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.556687 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562363 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.562671 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562686 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.562705 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="proxy-httpd" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562712 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="proxy-httpd" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.562725 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api-log" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562732 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api-log" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.562741 4725 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="ceilometer-notification-agent" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562747 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="ceilometer-notification-agent" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.562760 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="sg-core" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562766 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="sg-core" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562946 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="ceilometer-notification-agent" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562958 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api-log" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562969 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="proxy-httpd" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562980 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" containerName="cinder-api" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.562992 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" containerName="sg-core" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.563848 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.570600 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.571364 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.571572 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.571684 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.605595 4725 scope.go:117] "RemoveContainer" containerID="f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.623947 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814\": container with ID starting with f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814 not found: ID does not exist" containerID="f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.624686 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814"} err="failed to get container status \"f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814\": rpc error: code = NotFound desc = could not find container \"f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814\": container with ID starting with f0548c3ef47a81a1e9fa7cf0d1973980873bac4c22df1e72af37f395e2f33814 not found: ID does not exist" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 
11:13:11.624803 4725 scope.go:117] "RemoveContainer" containerID="c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.635506 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef\": container with ID starting with c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef not found: ID does not exist" containerID="c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.635570 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef"} err="failed to get container status \"c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef\": rpc error: code = NotFound desc = could not find container \"c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef\": container with ID starting with c04a97aa274eaad1e788c422c35d1110518b68c02acbf6085e29e5f66564c7ef not found: ID does not exist" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.635603 4725 scope.go:117] "RemoveContainer" containerID="e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.635993 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c\": container with ID starting with e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c not found: ID does not exist" containerID="e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.636012 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c"} err="failed to get container status \"e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c\": rpc error: code = NotFound desc = could not find container \"e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c\": container with ID starting with e7733d97f300ae2486791e598371b500216a1b62dd8e4190c9740b08e88b292c not found: ID does not exist" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.636024 4725 scope.go:117] "RemoveContainer" containerID="6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.661758 4725 scope.go:117] "RemoveContainer" containerID="4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.684101 4725 scope.go:117] "RemoveContainer" containerID="6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.684720 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c\": container with ID starting with 6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c not found: ID does not exist" containerID="6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.684925 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c"} err="failed to get container status \"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c\": rpc error: code = NotFound desc = could not find container \"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c\": container with ID starting with 
6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c not found: ID does not exist" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.685009 4725 scope.go:117] "RemoveContainer" containerID="4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f" Feb 25 11:13:11 crc kubenswrapper[4725]: E0225 11:13:11.685381 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f\": container with ID starting with 4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f not found: ID does not exist" containerID="4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.685408 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f"} err="failed to get container status \"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f\": rpc error: code = NotFound desc = could not find container \"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f\": container with ID starting with 4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f not found: ID does not exist" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.685425 4725 scope.go:117] "RemoveContainer" containerID="6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.685893 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c"} err="failed to get container status \"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c\": rpc error: code = NotFound desc = could not find container \"6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c\": container with ID 
starting with 6e853976a627917277a089b5f8094e1bb381452b22917a9b35bc84a832da186c not found: ID does not exist" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.685989 4725 scope.go:117] "RemoveContainer" containerID="4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.686422 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f"} err="failed to get container status \"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f\": rpc error: code = NotFound desc = could not find container \"4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f\": container with ID starting with 4e54861cfffa8659f4fc3fbc9cdb09d2d4685d4ee2d7119439ddd97e50cd496f not found: ID does not exist" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.725761 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.725980 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-config-data\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.726091 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca608800-07d2-4b62-8ac2-e544a667d664-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " 
pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.726152 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca608800-07d2-4b62-8ac2-e544a667d664-logs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.726209 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.726282 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.726551 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-scripts\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.726601 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.726628 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l9ql\" (UniqueName: \"kubernetes.io/projected/ca608800-07d2-4b62-8ac2-e544a667d664-kube-api-access-6l9ql\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829603 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-config-data\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829653 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca608800-07d2-4b62-8ac2-e544a667d664-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829671 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca608800-07d2-4b62-8ac2-e544a667d664-logs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829693 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829721 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829787 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-scripts\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829805 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829837 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l9ql\" (UniqueName: \"kubernetes.io/projected/ca608800-07d2-4b62-8ac2-e544a667d664-kube-api-access-6l9ql\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.829874 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.834719 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca608800-07d2-4b62-8ac2-e544a667d664-logs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.834775 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/ca608800-07d2-4b62-8ac2-e544a667d664-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.837718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.838325 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.849040 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.851388 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.852320 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-scripts\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 
11:13:11.854431 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l9ql\" (UniqueName: \"kubernetes.io/projected/ca608800-07d2-4b62-8ac2-e544a667d664-kube-api-access-6l9ql\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.858696 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca608800-07d2-4b62-8ac2-e544a667d664-config-data\") pod \"cinder-api-0\" (UID: \"ca608800-07d2-4b62-8ac2-e544a667d664\") " pod="openstack/cinder-api-0" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.866569 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57988f9b54-kk5lw" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:50362->10.217.0.166:9311: read: connection reset by peer" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.866680 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57988f9b54-kk5lw" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:50346->10.217.0.166:9311: read: connection reset by peer" Feb 25 11:13:11 crc kubenswrapper[4725]: I0225 11:13:11.934672 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.093104 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:12 crc kubenswrapper[4725]: E0225 11:13:12.109516 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73d640b6_86ab_4476_b233_dc7a95f5076c.slice/crio-6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9.scope\": RecentStats: unable to find data in memory cache]" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.111889 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.121679 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.123913 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.129915 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.130079 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.140616 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.243084 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jx5n\" (UniqueName: \"kubernetes.io/projected/18ec6012-0694-4f01-a51e-709b0c6999fb-kube-api-access-7jx5n\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.243168 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-config-data\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.243187 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-log-httpd\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.243228 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-scripts\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " 
pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.243253 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.243267 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-run-httpd\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.243293 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.344880 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-scripts\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.344936 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.344954 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-run-httpd\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.344979 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.345048 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jx5n\" (UniqueName: \"kubernetes.io/projected/18ec6012-0694-4f01-a51e-709b0c6999fb-kube-api-access-7jx5n\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.345089 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-config-data\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.345104 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-log-httpd\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.345769 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-log-httpd\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc 
kubenswrapper[4725]: I0225 11:13:12.345854 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-run-httpd\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.350773 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.351652 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.364967 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-config-data\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.365417 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-scripts\") pod \"ceilometer-0\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.368518 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jx5n\" (UniqueName: \"kubernetes.io/projected/18ec6012-0694-4f01-a51e-709b0c6999fb-kube-api-access-7jx5n\") pod \"ceilometer-0\" (UID: 
\"18ec6012-0694-4f01-a51e-709b0c6999fb\") " pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.422288 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.466231 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.478169 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.502724 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.510133 4725 generic.go:334] "Generic (PLEG): container finished" podID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerID="6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9" exitCode=0 Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.510182 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57988f9b54-kk5lw" event={"ID":"73d640b6-86ab-4476-b233-dc7a95f5076c","Type":"ContainerDied","Data":"6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9"} Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.510216 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57988f9b54-kk5lw" event={"ID":"73d640b6-86ab-4476-b233-dc7a95f5076c","Type":"ContainerDied","Data":"3a061ec8faa029ffe01e0dff15ae0233e578f35a3694b99e886a1b944825768e"} Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.510237 4725 scope.go:117] "RemoveContainer" containerID="6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.510375 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57988f9b54-kk5lw" Feb 25 11:13:12 crc kubenswrapper[4725]: W0225 11:13:12.510723 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca608800_07d2_4b62_8ac2_e544a667d664.slice/crio-adeb3d295895b4091f4258e89d2c6b60bc9d8c0f2d22fcc30a50451e45af63f3 WatchSource:0}: Error finding container adeb3d295895b4091f4258e89d2c6b60bc9d8c0f2d22fcc30a50451e45af63f3: Status 404 returned error can't find the container with id adeb3d295895b4091f4258e89d2c6b60bc9d8c0f2d22fcc30a50451e45af63f3 Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.530359 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.548201 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwwns\" (UniqueName: \"kubernetes.io/projected/73d640b6-86ab-4476-b233-dc7a95f5076c-kube-api-access-cwwns\") pod \"73d640b6-86ab-4476-b233-dc7a95f5076c\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.548261 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data\") pod \"73d640b6-86ab-4476-b233-dc7a95f5076c\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.548294 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-combined-ca-bundle\") pod \"73d640b6-86ab-4476-b233-dc7a95f5076c\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.548371 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d640b6-86ab-4476-b233-dc7a95f5076c-logs\") pod \"73d640b6-86ab-4476-b233-dc7a95f5076c\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.548450 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data-custom\") pod \"73d640b6-86ab-4476-b233-dc7a95f5076c\" (UID: \"73d640b6-86ab-4476-b233-dc7a95f5076c\") " Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.549254 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d640b6-86ab-4476-b233-dc7a95f5076c-logs" (OuterVolumeSpecName: "logs") pod "73d640b6-86ab-4476-b233-dc7a95f5076c" (UID: "73d640b6-86ab-4476-b233-dc7a95f5076c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.558390 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "73d640b6-86ab-4476-b233-dc7a95f5076c" (UID: "73d640b6-86ab-4476-b233-dc7a95f5076c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.558457 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d640b6-86ab-4476-b233-dc7a95f5076c-kube-api-access-cwwns" (OuterVolumeSpecName: "kube-api-access-cwwns") pod "73d640b6-86ab-4476-b233-dc7a95f5076c" (UID: "73d640b6-86ab-4476-b233-dc7a95f5076c"). InnerVolumeSpecName "kube-api-access-cwwns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.601008 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73d640b6-86ab-4476-b233-dc7a95f5076c" (UID: "73d640b6-86ab-4476-b233-dc7a95f5076c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.628011 4725 scope.go:117] "RemoveContainer" containerID="ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.628101 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data" (OuterVolumeSpecName: "config-data") pod "73d640b6-86ab-4476-b233-dc7a95f5076c" (UID: "73d640b6-86ab-4476-b233-dc7a95f5076c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.651323 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwwns\" (UniqueName: \"kubernetes.io/projected/73d640b6-86ab-4476-b233-dc7a95f5076c-kube-api-access-cwwns\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.651357 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.651366 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.651375 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73d640b6-86ab-4476-b233-dc7a95f5076c-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.651387 4725 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73d640b6-86ab-4476-b233-dc7a95f5076c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.679984 4725 scope.go:117] "RemoveContainer" containerID="6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9" Feb 25 11:13:12 crc kubenswrapper[4725]: E0225 11:13:12.682602 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9\": container with ID starting with 6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9 not found: ID does not exist" 
containerID="6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.682655 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9"} err="failed to get container status \"6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9\": rpc error: code = NotFound desc = could not find container \"6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9\": container with ID starting with 6914ab9e71906d0a766e5ead1e69963013599fa6de00b35ec102fd83a52a8fb9 not found: ID does not exist" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.682679 4725 scope.go:117] "RemoveContainer" containerID="ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340" Feb 25 11:13:12 crc kubenswrapper[4725]: E0225 11:13:12.685945 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340\": container with ID starting with ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340 not found: ID does not exist" containerID="ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.685982 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340"} err="failed to get container status \"ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340\": rpc error: code = NotFound desc = could not find container \"ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340\": container with ID starting with ae37e0c21ccc62a651da39e3658616df835c7cba292cd47fd878ae05ebdb3340 not found: ID does not exist" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.721900 4725 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/neutron-78c8d69889-vkkmw"] Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.722129 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78c8d69889-vkkmw" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-api" containerID="cri-o://aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a" gracePeriod=30 Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.722753 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78c8d69889-vkkmw" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-httpd" containerID="cri-o://434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11" gracePeriod=30 Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.750044 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58868cbfd5-pvwdv"] Feb 25 11:13:12 crc kubenswrapper[4725]: E0225 11:13:12.750394 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.750405 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api" Feb 25 11:13:12 crc kubenswrapper[4725]: E0225 11:13:12.750427 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api-log" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.750433 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api-log" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.750578 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.750595 
4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" containerName="barbican-api-log" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.751423 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.767709 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58868cbfd5-pvwdv"] Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.839247 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-78c8d69889-vkkmw" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:45232->10.217.0.159:9696: read: connection reset by peer" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.859457 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-ovndb-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.859504 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-config\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.859521 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-httpd-config\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " 
pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.859567 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-internal-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.859619 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm58g\" (UniqueName: \"kubernetes.io/projected/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-kube-api-access-xm58g\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.859647 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-combined-ca-bundle\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.859702 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-public-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.867986 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57988f9b54-kk5lw"] Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.881813 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-57988f9b54-kk5lw"] Feb 
25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.961029 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-ovndb-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.961084 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-config\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.961101 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-httpd-config\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.961148 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-internal-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.961204 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm58g\" (UniqueName: \"kubernetes.io/projected/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-kube-api-access-xm58g\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.961222 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-combined-ca-bundle\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.961277 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-public-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.967776 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-httpd-config\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.968295 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-ovndb-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.970614 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-config\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.971289 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-public-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.975349 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-combined-ca-bundle\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.981786 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm58g\" (UniqueName: \"kubernetes.io/projected/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-kube-api-access-xm58g\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:12 crc kubenswrapper[4725]: I0225 11:13:12.986485 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4971206d-e6f2-4355-8c47-9a7c9e1e51d6-internal-tls-certs\") pod \"neutron-58868cbfd5-pvwdv\" (UID: \"4971206d-e6f2-4355-8c47-9a7c9e1e51d6\") " pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.044761 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:13 crc kubenswrapper[4725]: W0225 11:13:13.053801 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18ec6012_0694_4f01_a51e_709b0c6999fb.slice/crio-faf38e93593af89e56e66026b738bd73ede546e8723ecb9786e25adf5e07ec1f WatchSource:0}: Error finding container faf38e93593af89e56e66026b738bd73ede546e8723ecb9786e25adf5e07ec1f: Status 404 returned error can't find the container with id 
faf38e93593af89e56e66026b738bd73ede546e8723ecb9786e25adf5e07ec1f Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.146336 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.236424 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f38e78f-45de-4061-8fc4-561318e984dd" path="/var/lib/kubelet/pods/1f38e78f-45de-4061-8fc4-561318e984dd/volumes" Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.237202 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d640b6-86ab-4476-b233-dc7a95f5076c" path="/var/lib/kubelet/pods/73d640b6-86ab-4476-b233-dc7a95f5076c/volumes" Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.237812 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7492d83b-6fd0-420c-99a5-19caedc41981" path="/var/lib/kubelet/pods/7492d83b-6fd0-420c-99a5-19caedc41981/volumes" Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.365794 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.548352 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerStarted","Data":"faf38e93593af89e56e66026b738bd73ede546e8723ecb9786e25adf5e07ec1f"} Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.550577 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca608800-07d2-4b62-8ac2-e544a667d664","Type":"ContainerStarted","Data":"4b404a9aa8597eafdac21b969052cd760d59889033a551b0489253281948022c"} Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.550604 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"ca608800-07d2-4b62-8ac2-e544a667d664","Type":"ContainerStarted","Data":"adeb3d295895b4091f4258e89d2c6b60bc9d8c0f2d22fcc30a50451e45af63f3"} Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.554933 4725 generic.go:334] "Generic (PLEG): container finished" podID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerID="434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11" exitCode=0 Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.554978 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c8d69889-vkkmw" event={"ID":"762b572a-f761-4bb6-8e01-8ba87c01262c","Type":"ContainerDied","Data":"434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11"} Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.755255 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58868cbfd5-pvwdv"] Feb 25 11:13:13 crc kubenswrapper[4725]: W0225 11:13:13.778814 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4971206d_e6f2_4355_8c47_9a7c9e1e51d6.slice/crio-df27477f8a147c9fab47691a2c660e0bb3284462eaea406842031e6c843b34bc WatchSource:0}: Error finding container df27477f8a147c9fab47691a2c660e0bb3284462eaea406842031e6c843b34bc: Status 404 returned error can't find the container with id df27477f8a147c9fab47691a2c660e0bb3284462eaea406842031e6c843b34bc Feb 25 11:13:13 crc kubenswrapper[4725]: I0225 11:13:13.790383 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7cbf649584-gsrdx" Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.420704 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-78c8d69889-vkkmw" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Feb 25 11:13:14 crc kubenswrapper[4725]: 
I0225 11:13:14.568945 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58868cbfd5-pvwdv" event={"ID":"4971206d-e6f2-4355-8c47-9a7c9e1e51d6","Type":"ContainerStarted","Data":"a4b789de4c208594fbeedb485b488b19a6aa2a6d74e56a0e15c176f7da9f6eba"} Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.568990 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58868cbfd5-pvwdv" event={"ID":"4971206d-e6f2-4355-8c47-9a7c9e1e51d6","Type":"ContainerStarted","Data":"f6cac9625b6262753dcf98a07fba63a162d1264223bb8d54827eba3081f25582"} Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.568999 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58868cbfd5-pvwdv" event={"ID":"4971206d-e6f2-4355-8c47-9a7c9e1e51d6","Type":"ContainerStarted","Data":"df27477f8a147c9fab47691a2c660e0bb3284462eaea406842031e6c843b34bc"} Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.569034 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.572506 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerStarted","Data":"4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a"} Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.572542 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerStarted","Data":"e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb"} Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.575545 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca608800-07d2-4b62-8ac2-e544a667d664","Type":"ContainerStarted","Data":"3e6fd9c193efc55c32fff727e3143b282ed8a23a0f57a13a892662fa3949dcc2"} Feb 25 11:13:14 crc 
kubenswrapper[4725]: I0225 11:13:14.575706 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.592820 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-58868cbfd5-pvwdv" podStartSLOduration=2.592799408 podStartE2EDuration="2.592799408s" podCreationTimestamp="2026-02-25 11:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:14.585298858 +0000 UTC m=+1220.083880903" watchObservedRunningTime="2026-02-25 11:13:14.592799408 +0000 UTC m=+1220.091381433" Feb 25 11:13:14 crc kubenswrapper[4725]: I0225 11:13:14.605877 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.605860886 podStartE2EDuration="3.605860886s" podCreationTimestamp="2026-02-25 11:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:14.604733366 +0000 UTC m=+1220.103315401" watchObservedRunningTime="2026-02-25 11:13:14.605860886 +0000 UTC m=+1220.104442921" Feb 25 11:13:15 crc kubenswrapper[4725]: I0225 11:13:15.330425 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:13:15 crc kubenswrapper[4725]: I0225 11:13:15.587506 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerStarted","Data":"d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172"} Feb 25 11:13:15 crc kubenswrapper[4725]: I0225 11:13:15.662167 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7cbf649584-gsrdx" Feb 25 11:13:15 crc kubenswrapper[4725]: I0225 11:13:15.744238 4725 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64cd88bfbd-zxddf"] Feb 25 11:13:15 crc kubenswrapper[4725]: I0225 11:13:15.744467 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64cd88bfbd-zxddf" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon-log" containerID="cri-o://814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3" gracePeriod=30 Feb 25 11:13:15 crc kubenswrapper[4725]: I0225 11:13:15.744915 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64cd88bfbd-zxddf" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon" containerID="cri-o://071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f" gracePeriod=30 Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.353014 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.453184 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-ovndb-tls-certs\") pod \"762b572a-f761-4bb6-8e01-8ba87c01262c\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.453249 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-combined-ca-bundle\") pod \"762b572a-f761-4bb6-8e01-8ba87c01262c\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.453426 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-internal-tls-certs\") pod 
\"762b572a-f761-4bb6-8e01-8ba87c01262c\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.453517 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brjvf\" (UniqueName: \"kubernetes.io/projected/762b572a-f761-4bb6-8e01-8ba87c01262c-kube-api-access-brjvf\") pod \"762b572a-f761-4bb6-8e01-8ba87c01262c\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.453541 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-config\") pod \"762b572a-f761-4bb6-8e01-8ba87c01262c\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.453566 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-public-tls-certs\") pod \"762b572a-f761-4bb6-8e01-8ba87c01262c\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.453619 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-httpd-config\") pod \"762b572a-f761-4bb6-8e01-8ba87c01262c\" (UID: \"762b572a-f761-4bb6-8e01-8ba87c01262c\") " Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.461979 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762b572a-f761-4bb6-8e01-8ba87c01262c-kube-api-access-brjvf" (OuterVolumeSpecName: "kube-api-access-brjvf") pod "762b572a-f761-4bb6-8e01-8ba87c01262c" (UID: "762b572a-f761-4bb6-8e01-8ba87c01262c"). InnerVolumeSpecName "kube-api-access-brjvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.463979 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "762b572a-f761-4bb6-8e01-8ba87c01262c" (UID: "762b572a-f761-4bb6-8e01-8ba87c01262c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.512136 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-config" (OuterVolumeSpecName: "config") pod "762b572a-f761-4bb6-8e01-8ba87c01262c" (UID: "762b572a-f761-4bb6-8e01-8ba87c01262c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.537113 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "762b572a-f761-4bb6-8e01-8ba87c01262c" (UID: "762b572a-f761-4bb6-8e01-8ba87c01262c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.537869 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "762b572a-f761-4bb6-8e01-8ba87c01262c" (UID: "762b572a-f761-4bb6-8e01-8ba87c01262c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.545129 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "762b572a-f761-4bb6-8e01-8ba87c01262c" (UID: "762b572a-f761-4bb6-8e01-8ba87c01262c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.549451 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "762b572a-f761-4bb6-8e01-8ba87c01262c" (UID: "762b572a-f761-4bb6-8e01-8ba87c01262c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.556701 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.556730 4725 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.556744 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.556755 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.556767 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brjvf\" (UniqueName: \"kubernetes.io/projected/762b572a-f761-4bb6-8e01-8ba87c01262c-kube-api-access-brjvf\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.556779 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.556790 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762b572a-f761-4bb6-8e01-8ba87c01262c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.618035 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.619976 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerStarted","Data":"e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca"} Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.620107 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.621641 4725 generic.go:334] "Generic (PLEG): container finished" podID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerID="aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a" exitCode=0 Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.621667 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c8d69889-vkkmw" 
event={"ID":"762b572a-f761-4bb6-8e01-8ba87c01262c","Type":"ContainerDied","Data":"aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a"} Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.621681 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c8d69889-vkkmw" event={"ID":"762b572a-f761-4bb6-8e01-8ba87c01262c","Type":"ContainerDied","Data":"43677c094327e83fa3b5895b3981013eb45fa730bf16bf71a81fd7832055ea35"} Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.621697 4725 scope.go:117] "RemoveContainer" containerID="434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.621785 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78c8d69889-vkkmw" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.710868 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78c8d69889-vkkmw"] Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.736888 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78c8d69889-vkkmw"] Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.751539 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.067437447 podStartE2EDuration="5.751523613s" podCreationTimestamp="2026-02-25 11:13:12 +0000 UTC" firstStartedPulling="2026-02-25 11:13:13.057141222 +0000 UTC m=+1218.555723247" lastFinishedPulling="2026-02-25 11:13:16.741227378 +0000 UTC m=+1222.239809413" observedRunningTime="2026-02-25 11:13:17.731260623 +0000 UTC m=+1223.229842648" watchObservedRunningTime="2026-02-25 11:13:17.751523613 +0000 UTC m=+1223.250105638" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.753077 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-lvpd8"] Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.753298 4725 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" podUID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerName="dnsmasq-dns" containerID="cri-o://7eaeb2c06bc7d872db3845efe5d7ead5820d5d640b75ec0c0163e14ba2294dfc" gracePeriod=10 Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.786196 4725 scope.go:117] "RemoveContainer" containerID="aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.870234 4725 scope.go:117] "RemoveContainer" containerID="434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11" Feb 25 11:13:17 crc kubenswrapper[4725]: E0225 11:13:17.871196 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11\": container with ID starting with 434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11 not found: ID does not exist" containerID="434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.871228 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11"} err="failed to get container status \"434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11\": rpc error: code = NotFound desc = could not find container \"434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11\": container with ID starting with 434654347c58f6ab89933b7b31977b95e6e4c56a1d5f60058369ff6532e7bf11 not found: ID does not exist" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.871248 4725 scope.go:117] "RemoveContainer" containerID="aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a" Feb 25 11:13:17 crc kubenswrapper[4725]: E0225 11:13:17.876983 4725 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a\": container with ID starting with aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a not found: ID does not exist" containerID="aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a" Feb 25 11:13:17 crc kubenswrapper[4725]: I0225 11:13:17.877042 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a"} err="failed to get container status \"aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a\": rpc error: code = NotFound desc = could not find container \"aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a\": container with ID starting with aa5748617496d8cd8569fecc3ee651762f0f876410ac7b7dc7026814ed23149a not found: ID does not exist" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.023440 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.058511 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.633869 4725 generic.go:334] "Generic (PLEG): container finished" podID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerID="7eaeb2c06bc7d872db3845efe5d7ead5820d5d640b75ec0c0163e14ba2294dfc" exitCode=0 Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.633953 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" event={"ID":"0f610676-8c4b-4152-bfe0-5d1ccf467671","Type":"ContainerDied","Data":"7eaeb2c06bc7d872db3845efe5d7ead5820d5d640b75ec0c0163e14ba2294dfc"} Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.635422 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="cinder-scheduler" containerID="cri-o://08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871" gracePeriod=30 Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.635972 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="probe" containerID="cri-o://2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222" gracePeriod=30 Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.761803 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.885010 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-config\") pod \"0f610676-8c4b-4152-bfe0-5d1ccf467671\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.885102 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-svc\") pod \"0f610676-8c4b-4152-bfe0-5d1ccf467671\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.885146 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-nb\") pod \"0f610676-8c4b-4152-bfe0-5d1ccf467671\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.885186 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-swift-storage-0\") pod \"0f610676-8c4b-4152-bfe0-5d1ccf467671\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.885274 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-sb\") pod \"0f610676-8c4b-4152-bfe0-5d1ccf467671\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.885345 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c25p4\" (UniqueName: \"kubernetes.io/projected/0f610676-8c4b-4152-bfe0-5d1ccf467671-kube-api-access-c25p4\") pod \"0f610676-8c4b-4152-bfe0-5d1ccf467671\" (UID: \"0f610676-8c4b-4152-bfe0-5d1ccf467671\") " Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.899249 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f610676-8c4b-4152-bfe0-5d1ccf467671-kube-api-access-c25p4" (OuterVolumeSpecName: "kube-api-access-c25p4") pod "0f610676-8c4b-4152-bfe0-5d1ccf467671" (UID: "0f610676-8c4b-4152-bfe0-5d1ccf467671"). InnerVolumeSpecName "kube-api-access-c25p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.938702 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f610676-8c4b-4152-bfe0-5d1ccf467671" (UID: "0f610676-8c4b-4152-bfe0-5d1ccf467671"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.943954 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f610676-8c4b-4152-bfe0-5d1ccf467671" (UID: "0f610676-8c4b-4152-bfe0-5d1ccf467671"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.947592 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-config" (OuterVolumeSpecName: "config") pod "0f610676-8c4b-4152-bfe0-5d1ccf467671" (UID: "0f610676-8c4b-4152-bfe0-5d1ccf467671"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.950424 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f610676-8c4b-4152-bfe0-5d1ccf467671" (UID: "0f610676-8c4b-4152-bfe0-5d1ccf467671"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.957025 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0f610676-8c4b-4152-bfe0-5d1ccf467671" (UID: "0f610676-8c4b-4152-bfe0-5d1ccf467671"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.991523 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c25p4\" (UniqueName: \"kubernetes.io/projected/0f610676-8c4b-4152-bfe0-5d1ccf467671-kube-api-access-c25p4\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.991735 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.991794 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.991868 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.991925 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:18 crc kubenswrapper[4725]: I0225 11:13:18.992001 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f610676-8c4b-4152-bfe0-5d1ccf467671-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.235285 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" path="/var/lib/kubelet/pods/762b572a-f761-4bb6-8e01-8ba87c01262c/volumes" Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.659188 4725 generic.go:334] 
"Generic (PLEG): container finished" podID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerID="2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222" exitCode=0 Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.659304 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33","Type":"ContainerDied","Data":"2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222"} Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.678596 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" event={"ID":"0f610676-8c4b-4152-bfe0-5d1ccf467671","Type":"ContainerDied","Data":"809fefd63171c7e78f71d003acf2091f25ebd279492530a290f60b38b24be93f"} Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.678674 4725 scope.go:117] "RemoveContainer" containerID="7eaeb2c06bc7d872db3845efe5d7ead5820d5d640b75ec0c0163e14ba2294dfc" Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.678900 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-lvpd8" Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.688159 4725 generic.go:334] "Generic (PLEG): container finished" podID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerID="071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f" exitCode=0 Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.688255 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cd88bfbd-zxddf" event={"ID":"abad9fb0-482e-4ed1-8bf5-e738ee946358","Type":"ContainerDied","Data":"071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f"} Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.707076 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-lvpd8"] Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.715727 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-lvpd8"] Feb 25 11:13:19 crc kubenswrapper[4725]: I0225 11:13:19.717898 4725 scope.go:117] "RemoveContainer" containerID="661e3bb5faa3087c218ba03b3a0d35a4416f1e5cea7198249aedd5df77bf7178" Feb 25 11:13:20 crc kubenswrapper[4725]: I0225 11:13:20.443020 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:13:20 crc kubenswrapper[4725]: I0225 11:13:20.486361 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.234031 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f610676-8c4b-4152-bfe0-5d1ccf467671" path="/var/lib/kubelet/pods/0f610676-8c4b-4152-bfe0-5d1ccf467671/volumes" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.320662 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64cd88bfbd-zxddf" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.615819 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.715111 4725 generic.go:334] "Generic (PLEG): container finished" podID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerID="08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871" exitCode=0 Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.715159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33","Type":"ContainerDied","Data":"08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871"} Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.715190 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.715210 4725 scope.go:117] "RemoveContainer" containerID="2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.715193 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33","Type":"ContainerDied","Data":"96c02d6eb935ddb3fb6da97807e766556c812c57d362a6fb5272abfb8ea7a70f"} Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.734901 4725 scope.go:117] "RemoveContainer" containerID="08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738376 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-etc-machine-id\") pod \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738424 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-combined-ca-bundle\") pod \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738463 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data-custom\") pod \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738480 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" (UID: "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738488 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-scripts\") pod \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738550 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data\") pod \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738583 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9v5h\" (UniqueName: \"kubernetes.io/projected/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-kube-api-access-p9v5h\") pod \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\" (UID: \"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33\") " Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.738951 4725 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.744404 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-scripts" (OuterVolumeSpecName: "scripts") pod "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" (UID: "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.746056 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-kube-api-access-p9v5h" (OuterVolumeSpecName: "kube-api-access-p9v5h") pod "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" (UID: "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33"). InnerVolumeSpecName "kube-api-access-p9v5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.750512 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" (UID: "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.829557 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" (UID: "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.845011 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.845051 4725 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.845065 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.845075 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9v5h\" (UniqueName: \"kubernetes.io/projected/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-kube-api-access-p9v5h\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.852520 4725 scope.go:117] "RemoveContainer" containerID="2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222" Feb 25 11:13:21 crc kubenswrapper[4725]: E0225 11:13:21.854262 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222\": container with ID starting with 2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222 not found: ID does not exist" containerID="2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.854337 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222"} err="failed to get container status \"2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222\": rpc error: code = NotFound desc = could not find container \"2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222\": container with ID starting with 2b709f5f4790652947341ef81ff41803aa08fa05d0398e136735363dbb68d222 not found: ID does not exist" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.854370 4725 scope.go:117] "RemoveContainer" containerID="08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871" Feb 25 11:13:21 crc kubenswrapper[4725]: E0225 11:13:21.857441 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871\": container with ID starting with 08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871 not found: ID does not exist" containerID="08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.857491 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871"} err="failed to get container status \"08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871\": rpc error: code = NotFound desc = could not find container \"08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871\": container with ID starting with 08753e811ce17fb8752a981d90e2030b55675592d97c9859bb1195f0c5d11871 not found: ID does not exist" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.895653 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data" (OuterVolumeSpecName: "config-data") pod "7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" (UID: 
"7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:21 crc kubenswrapper[4725]: I0225 11:13:21.947279 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.058070 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.069679 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.085653 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 11:13:22 crc kubenswrapper[4725]: E0225 11:13:22.086106 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerName="init" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086123 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerName="init" Feb 25 11:13:22 crc kubenswrapper[4725]: E0225 11:13:22.086142 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="cinder-scheduler" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086149 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="cinder-scheduler" Feb 25 11:13:22 crc kubenswrapper[4725]: E0225 11:13:22.086164 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="probe" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086170 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="probe" 
Feb 25 11:13:22 crc kubenswrapper[4725]: E0225 11:13:22.086185 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-httpd" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086191 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-httpd" Feb 25 11:13:22 crc kubenswrapper[4725]: E0225 11:13:22.086200 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-api" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086206 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-api" Feb 25 11:13:22 crc kubenswrapper[4725]: E0225 11:13:22.086217 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerName="dnsmasq-dns" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086222 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerName="dnsmasq-dns" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086368 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="cinder-scheduler" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086382 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" containerName="probe" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086394 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-httpd" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.086402 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="762b572a-f761-4bb6-8e01-8ba87c01262c" containerName="neutron-api" Feb 25 11:13:22 crc 
kubenswrapper[4725]: I0225 11:13:22.086415 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f610676-8c4b-4152-bfe0-5d1ccf467671" containerName="dnsmasq-dns" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.087309 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.091773 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.093232 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.150547 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.150599 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.150634 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.150797 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-glnkd\" (UniqueName: \"kubernetes.io/projected/5a023b0b-cd51-47db-9fdf-74c673713272-kube-api-access-glnkd\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.150845 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a023b0b-cd51-47db-9fdf-74c673713272-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.151051 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.252201 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.252258 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.252344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.252454 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glnkd\" (UniqueName: \"kubernetes.io/projected/5a023b0b-cd51-47db-9fdf-74c673713272-kube-api-access-glnkd\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.252491 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a023b0b-cd51-47db-9fdf-74c673713272-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.252620 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a023b0b-cd51-47db-9fdf-74c673713272-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.252661 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.255848 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc 
kubenswrapper[4725]: I0225 11:13:22.257464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.257553 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.259637 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a023b0b-cd51-47db-9fdf-74c673713272-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.270472 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glnkd\" (UniqueName: \"kubernetes.io/projected/5a023b0b-cd51-47db-9fdf-74c673713272-kube-api-access-glnkd\") pod \"cinder-scheduler-0\" (UID: \"5a023b0b-cd51-47db-9fdf-74c673713272\") " pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.420415 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 11:13:22 crc kubenswrapper[4725]: I0225 11:13:22.911058 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 11:13:23 crc kubenswrapper[4725]: I0225 11:13:23.242231 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33" path="/var/lib/kubelet/pods/7d4e6fb0-7799-4bd3-b0d1-1a6b1c9dfe33/volumes" Feb 25 11:13:23 crc kubenswrapper[4725]: I0225 11:13:23.627766 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 25 11:13:23 crc kubenswrapper[4725]: I0225 11:13:23.766464 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a023b0b-cd51-47db-9fdf-74c673713272","Type":"ContainerStarted","Data":"607b36d3f093f0ae78e66d7ab17a61d7168b0991807475f5cc572897746e69db"} Feb 25 11:13:23 crc kubenswrapper[4725]: I0225 11:13:23.766519 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a023b0b-cd51-47db-9fdf-74c673713272","Type":"ContainerStarted","Data":"705e2c3967adce0fc7769f8e7034539d5dd3aaead1ee2b69377ddfdde5dbbe98"} Feb 25 11:13:24 crc kubenswrapper[4725]: I0225 11:13:24.731998 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7dcb568bf7-chvcs" Feb 25 11:13:24 crc kubenswrapper[4725]: I0225 11:13:24.785240 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a023b0b-cd51-47db-9fdf-74c673713272","Type":"ContainerStarted","Data":"2bfb8db39584d0b0cb45bb317a87b7fc923523589be59a072028bcadff1a0663"} Feb 25 11:13:24 crc kubenswrapper[4725]: I0225 11:13:24.806072 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:13:24 crc kubenswrapper[4725]: I0225 11:13:24.820199 4725 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.820180013 podStartE2EDuration="2.820180013s" podCreationTimestamp="2026-02-25 11:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:24.817201603 +0000 UTC m=+1230.315783628" watchObservedRunningTime="2026-02-25 11:13:24.820180013 +0000 UTC m=+1230.318762038" Feb 25 11:13:25 crc kubenswrapper[4725]: I0225 11:13:25.030589 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69c7668f4d-s7tf6" Feb 25 11:13:25 crc kubenswrapper[4725]: I0225 11:13:25.106091 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-744d85fb8-vb847"] Feb 25 11:13:25 crc kubenswrapper[4725]: I0225 11:13:25.106345 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-744d85fb8-vb847" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-log" containerID="cri-o://195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548" gracePeriod=30 Feb 25 11:13:25 crc kubenswrapper[4725]: I0225 11:13:25.106470 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-744d85fb8-vb847" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-api" containerID="cri-o://b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c" gracePeriod=30 Feb 25 11:13:25 crc kubenswrapper[4725]: I0225 11:13:25.798126 4725 generic.go:334] "Generic (PLEG): container finished" podID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerID="195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548" exitCode=143 Feb 25 11:13:25 crc kubenswrapper[4725]: I0225 11:13:25.798188 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744d85fb8-vb847" 
event={"ID":"1b756696-a908-43f3-8b48-f6ceadb25bb6","Type":"ContainerDied","Data":"195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548"} Feb 25 11:13:27 crc kubenswrapper[4725]: I0225 11:13:27.421285 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.655200 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.656682 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.658766 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.658989 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.659366 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-t82t7" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.678396 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.736089 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.786954 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-openstack-config-secret\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.787131 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5w59\" (UniqueName: \"kubernetes.io/projected/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-kube-api-access-x5w59\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.787180 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-openstack-config\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.787205 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-combined-ca-bundle\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.844551 4725 generic.go:334] "Generic (PLEG): container finished" podID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerID="b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c" exitCode=0 Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.844677 4725 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/placement-744d85fb8-vb847" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.844726 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744d85fb8-vb847" event={"ID":"1b756696-a908-43f3-8b48-f6ceadb25bb6","Type":"ContainerDied","Data":"b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c"} Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.845045 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744d85fb8-vb847" event={"ID":"1b756696-a908-43f3-8b48-f6ceadb25bb6","Type":"ContainerDied","Data":"2b458c57fa708f7365fde91c901a5ca3811b91388df58200727794339cc2ff1d"} Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.845085 4725 scope.go:117] "RemoveContainer" containerID="b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.870580 4725 scope.go:117] "RemoveContainer" containerID="195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.888655 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-internal-tls-certs\") pod \"1b756696-a908-43f3-8b48-f6ceadb25bb6\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.888737 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-combined-ca-bundle\") pod \"1b756696-a908-43f3-8b48-f6ceadb25bb6\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.888772 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1b756696-a908-43f3-8b48-f6ceadb25bb6-logs\") pod \"1b756696-a908-43f3-8b48-f6ceadb25bb6\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.888836 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-public-tls-certs\") pod \"1b756696-a908-43f3-8b48-f6ceadb25bb6\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.888994 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-scripts\") pod \"1b756696-a908-43f3-8b48-f6ceadb25bb6\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.889019 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-config-data\") pod \"1b756696-a908-43f3-8b48-f6ceadb25bb6\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.889507 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjr5w\" (UniqueName: \"kubernetes.io/projected/1b756696-a908-43f3-8b48-f6ceadb25bb6-kube-api-access-cjr5w\") pod \"1b756696-a908-43f3-8b48-f6ceadb25bb6\" (UID: \"1b756696-a908-43f3-8b48-f6ceadb25bb6\") " Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.889700 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b756696-a908-43f3-8b48-f6ceadb25bb6-logs" (OuterVolumeSpecName: "logs") pod "1b756696-a908-43f3-8b48-f6ceadb25bb6" (UID: "1b756696-a908-43f3-8b48-f6ceadb25bb6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.889737 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-openstack-config-secret\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.890169 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5w59\" (UniqueName: \"kubernetes.io/projected/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-kube-api-access-x5w59\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.890210 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-openstack-config\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.890229 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-combined-ca-bundle\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.890363 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b756696-a908-43f3-8b48-f6ceadb25bb6-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.891684 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-openstack-config\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.895244 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-combined-ca-bundle\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.898418 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b756696-a908-43f3-8b48-f6ceadb25bb6-kube-api-access-cjr5w" (OuterVolumeSpecName: "kube-api-access-cjr5w") pod "1b756696-a908-43f3-8b48-f6ceadb25bb6" (UID: "1b756696-a908-43f3-8b48-f6ceadb25bb6"). InnerVolumeSpecName "kube-api-access-cjr5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.899646 4725 scope.go:117] "RemoveContainer" containerID="b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c" Feb 25 11:13:28 crc kubenswrapper[4725]: E0225 11:13:28.901577 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c\": container with ID starting with b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c not found: ID does not exist" containerID="b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.901609 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c"} err="failed to get container status \"b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c\": rpc 
error: code = NotFound desc = could not find container \"b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c\": container with ID starting with b41fd30383b5dfc7189309de1c94912c705af94e814b83c0297813344853310c not found: ID does not exist" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.901630 4725 scope.go:117] "RemoveContainer" containerID="195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548" Feb 25 11:13:28 crc kubenswrapper[4725]: E0225 11:13:28.902096 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548\": container with ID starting with 195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548 not found: ID does not exist" containerID="195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.902119 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548"} err="failed to get container status \"195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548\": rpc error: code = NotFound desc = could not find container \"195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548\": container with ID starting with 195731ea56faee4bd06ce5ffb69e8cf91a38271947cf9cb74e6b299ef5c20548 not found: ID does not exist" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.915909 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-openstack-config-secret\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.919397 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x5w59\" (UniqueName: \"kubernetes.io/projected/71cbcb8e-872e-48b4-93a9-f5ee2edb3746-kube-api-access-x5w59\") pod \"openstackclient\" (UID: \"71cbcb8e-872e-48b4-93a9-f5ee2edb3746\") " pod="openstack/openstackclient" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.932075 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-scripts" (OuterVolumeSpecName: "scripts") pod "1b756696-a908-43f3-8b48-f6ceadb25bb6" (UID: "1b756696-a908-43f3-8b48-f6ceadb25bb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.950757 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b756696-a908-43f3-8b48-f6ceadb25bb6" (UID: "1b756696-a908-43f3-8b48-f6ceadb25bb6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.992025 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.992055 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.992068 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjr5w\" (UniqueName: \"kubernetes.io/projected/1b756696-a908-43f3-8b48-f6ceadb25bb6-kube-api-access-cjr5w\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:28 crc kubenswrapper[4725]: I0225 11:13:28.998249 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-config-data" (OuterVolumeSpecName: "config-data") pod "1b756696-a908-43f3-8b48-f6ceadb25bb6" (UID: "1b756696-a908-43f3-8b48-f6ceadb25bb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.019996 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1b756696-a908-43f3-8b48-f6ceadb25bb6" (UID: "1b756696-a908-43f3-8b48-f6ceadb25bb6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.042179 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1b756696-a908-43f3-8b48-f6ceadb25bb6" (UID: "1b756696-a908-43f3-8b48-f6ceadb25bb6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.048235 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.093630 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.093658 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.093670 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b756696-a908-43f3-8b48-f6ceadb25bb6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.184687 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-744d85fb8-vb847"] Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.191091 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-744d85fb8-vb847"] Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.235652 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" 
path="/var/lib/kubelet/pods/1b756696-a908-43f3-8b48-f6ceadb25bb6/volumes" Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.492782 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 11:13:29 crc kubenswrapper[4725]: I0225 11:13:29.856803 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"71cbcb8e-872e-48b4-93a9-f5ee2edb3746","Type":"ContainerStarted","Data":"df2b26e9ebc8dda5781023ad90473176e79edb9d5001447e8f7553b4efad189c"} Feb 25 11:13:31 crc kubenswrapper[4725]: I0225 11:13:31.323300 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64cd88bfbd-zxddf" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.629914 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.977088 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6d86f859c9-f94qp"] Feb 25 11:13:32 crc kubenswrapper[4725]: E0225 11:13:32.977496 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-log" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.977515 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-log" Feb 25 11:13:32 crc kubenswrapper[4725]: E0225 11:13:32.977525 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-api" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.977531 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-api" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.977683 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-log" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.977708 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b756696-a908-43f3-8b48-f6ceadb25bb6" containerName="placement-api" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.979924 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.982207 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.983010 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.986058 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 25 11:13:32 crc kubenswrapper[4725]: I0225 11:13:32.994048 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d86f859c9-f94qp"] Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.073936 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdb91fb4-91c1-4761-8724-24a845ee9d03-log-httpd\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.073996 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-combined-ca-bundle\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.074029 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-public-tls-certs\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.074357 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdb91fb4-91c1-4761-8724-24a845ee9d03-etc-swift\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.074397 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc58b\" (UniqueName: \"kubernetes.io/projected/cdb91fb4-91c1-4761-8724-24a845ee9d03-kube-api-access-bc58b\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.074568 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdb91fb4-91c1-4761-8724-24a845ee9d03-run-httpd\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.074694 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-config-data\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.074738 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-internal-tls-certs\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177558 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-combined-ca-bundle\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177639 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-public-tls-certs\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177731 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdb91fb4-91c1-4761-8724-24a845ee9d03-etc-swift\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177750 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc58b\" 
(UniqueName: \"kubernetes.io/projected/cdb91fb4-91c1-4761-8724-24a845ee9d03-kube-api-access-bc58b\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177776 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdb91fb4-91c1-4761-8724-24a845ee9d03-run-httpd\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177805 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-config-data\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177821 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-internal-tls-certs\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.177860 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdb91fb4-91c1-4761-8724-24a845ee9d03-log-httpd\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.178246 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cdb91fb4-91c1-4761-8724-24a845ee9d03-log-httpd\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.178805 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdb91fb4-91c1-4761-8724-24a845ee9d03-run-httpd\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.185179 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-internal-tls-certs\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.185920 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-config-data\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.185949 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdb91fb4-91c1-4761-8724-24a845ee9d03-etc-swift\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.188944 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-public-tls-certs\") pod \"swift-proxy-6d86f859c9-f94qp\" 
(UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.191160 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb91fb4-91c1-4761-8724-24a845ee9d03-combined-ca-bundle\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.197524 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc58b\" (UniqueName: \"kubernetes.io/projected/cdb91fb4-91c1-4761-8724-24a845ee9d03-kube-api-access-bc58b\") pod \"swift-proxy-6d86f859c9-f94qp\" (UID: \"cdb91fb4-91c1-4761-8724-24a845ee9d03\") " pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:33 crc kubenswrapper[4725]: I0225 11:13:33.302678 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.416807 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.417694 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-central-agent" containerID="cri-o://e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb" gracePeriod=30 Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.418493 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="sg-core" containerID="cri-o://d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172" gracePeriod=30 Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.418524 4725 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="proxy-httpd" containerID="cri-o://e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca" gracePeriod=30 Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.418525 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-notification-agent" containerID="cri-o://4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a" gracePeriod=30 Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.426074 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.932608 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"71cbcb8e-872e-48b4-93a9-f5ee2edb3746","Type":"ContainerStarted","Data":"cd9b1820f24374029b843e77df67874cd3da1c2f9a9d455ece044ff40375fbd7"} Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.936398 4725 generic.go:334] "Generic (PLEG): container finished" podID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerID="e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca" exitCode=0 Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.936447 4725 generic.go:334] "Generic (PLEG): container finished" podID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerID="d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172" exitCode=2 Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.936455 4725 generic.go:334] "Generic (PLEG): container finished" podID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerID="e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb" exitCode=0 Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.936471 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerDied","Data":"e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca"} Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.936488 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerDied","Data":"d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172"} Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.936499 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerDied","Data":"e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb"} Feb 25 11:13:37 crc kubenswrapper[4725]: I0225 11:13:37.955454 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.8551191949999999 podStartE2EDuration="9.955434022s" podCreationTimestamp="2026-02-25 11:13:28 +0000 UTC" firstStartedPulling="2026-02-25 11:13:29.508333665 +0000 UTC m=+1235.006915690" lastFinishedPulling="2026-02-25 11:13:37.608648492 +0000 UTC m=+1243.107230517" observedRunningTime="2026-02-25 11:13:37.949361129 +0000 UTC m=+1243.447943154" watchObservedRunningTime="2026-02-25 11:13:37.955434022 +0000 UTC m=+1243.454016037" Feb 25 11:13:38 crc kubenswrapper[4725]: W0225 11:13:38.062055 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb91fb4_91c1_4761_8724_24a845ee9d03.slice/crio-58c1b6f17ccd43e483e99eb70ebcd76f63301cafa04e8fe19c00c2b991e942a0 WatchSource:0}: Error finding container 58c1b6f17ccd43e483e99eb70ebcd76f63301cafa04e8fe19c00c2b991e942a0: Status 404 returned error can't find the container with id 58c1b6f17ccd43e483e99eb70ebcd76f63301cafa04e8fe19c00c2b991e942a0 Feb 25 11:13:38 crc kubenswrapper[4725]: I0225 11:13:38.064036 4725 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d86f859c9-f94qp"] Feb 25 11:13:38 crc kubenswrapper[4725]: I0225 11:13:38.949445 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d86f859c9-f94qp" event={"ID":"cdb91fb4-91c1-4761-8724-24a845ee9d03","Type":"ContainerStarted","Data":"d7d273909bee974194d70b021a08cf1a50ed14597eb0a39e630162fed7f5dde9"} Feb 25 11:13:38 crc kubenswrapper[4725]: I0225 11:13:38.949724 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d86f859c9-f94qp" event={"ID":"cdb91fb4-91c1-4761-8724-24a845ee9d03","Type":"ContainerStarted","Data":"f6d66207a408e3304f21049c5e816212a17b321b02b9f8b0fe7bfd83fdf20dcc"} Feb 25 11:13:38 crc kubenswrapper[4725]: I0225 11:13:38.949739 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d86f859c9-f94qp" event={"ID":"cdb91fb4-91c1-4761-8724-24a845ee9d03","Type":"ContainerStarted","Data":"58c1b6f17ccd43e483e99eb70ebcd76f63301cafa04e8fe19c00c2b991e942a0"} Feb 25 11:13:38 crc kubenswrapper[4725]: I0225 11:13:38.949770 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:38 crc kubenswrapper[4725]: I0225 11:13:38.949789 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:38 crc kubenswrapper[4725]: I0225 11:13:38.985643 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6d86f859c9-f94qp" podStartSLOduration=6.98562054 podStartE2EDuration="6.98562054s" podCreationTimestamp="2026-02-25 11:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:38.971251084 +0000 UTC m=+1244.469833119" watchObservedRunningTime="2026-02-25 11:13:38.98562054 +0000 UTC m=+1244.484202585" Feb 25 11:13:41 crc 
kubenswrapper[4725]: I0225 11:13:41.320229 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64cd88bfbd-zxddf" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.320918 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.555560 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.555945 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.555993 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.556766 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9d1cf00d5958f238b464e2eb2f371e000d949ef3901a3f7ece30337723bea95"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.556847 
4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://e9d1cf00d5958f238b464e2eb2f371e000d949ef3901a3f7ece30337723bea95" gracePeriod=600 Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.617725 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.731192 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-log-httpd\") pod \"18ec6012-0694-4f01-a51e-709b0c6999fb\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.731280 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-scripts\") pod \"18ec6012-0694-4f01-a51e-709b0c6999fb\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.731738 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18ec6012-0694-4f01-a51e-709b0c6999fb" (UID: "18ec6012-0694-4f01-a51e-709b0c6999fb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.732203 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-run-httpd\") pod \"18ec6012-0694-4f01-a51e-709b0c6999fb\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.732309 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-sg-core-conf-yaml\") pod \"18ec6012-0694-4f01-a51e-709b0c6999fb\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.732351 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jx5n\" (UniqueName: \"kubernetes.io/projected/18ec6012-0694-4f01-a51e-709b0c6999fb-kube-api-access-7jx5n\") pod \"18ec6012-0694-4f01-a51e-709b0c6999fb\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.732373 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-config-data\") pod \"18ec6012-0694-4f01-a51e-709b0c6999fb\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.732391 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-combined-ca-bundle\") pod \"18ec6012-0694-4f01-a51e-709b0c6999fb\" (UID: \"18ec6012-0694-4f01-a51e-709b0c6999fb\") " Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.732809 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.733381 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18ec6012-0694-4f01-a51e-709b0c6999fb" (UID: "18ec6012-0694-4f01-a51e-709b0c6999fb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.737567 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-scripts" (OuterVolumeSpecName: "scripts") pod "18ec6012-0694-4f01-a51e-709b0c6999fb" (UID: "18ec6012-0694-4f01-a51e-709b0c6999fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.740058 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ec6012-0694-4f01-a51e-709b0c6999fb-kube-api-access-7jx5n" (OuterVolumeSpecName: "kube-api-access-7jx5n") pod "18ec6012-0694-4f01-a51e-709b0c6999fb" (UID: "18ec6012-0694-4f01-a51e-709b0c6999fb"). InnerVolumeSpecName "kube-api-access-7jx5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.763268 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18ec6012-0694-4f01-a51e-709b0c6999fb" (UID: "18ec6012-0694-4f01-a51e-709b0c6999fb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.823071 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18ec6012-0694-4f01-a51e-709b0c6999fb" (UID: "18ec6012-0694-4f01-a51e-709b0c6999fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.833668 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ec6012-0694-4f01-a51e-709b0c6999fb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.833712 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.833725 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jx5n\" (UniqueName: \"kubernetes.io/projected/18ec6012-0694-4f01-a51e-709b0c6999fb-kube-api-access-7jx5n\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.833733 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.833741 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.833687 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-config-data" (OuterVolumeSpecName: "config-data") pod "18ec6012-0694-4f01-a51e-709b0c6999fb" (UID: "18ec6012-0694-4f01-a51e-709b0c6999fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.934680 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ec6012-0694-4f01-a51e-709b0c6999fb-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.977773 4725 generic.go:334] "Generic (PLEG): container finished" podID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerID="4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a" exitCode=0 Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.977883 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.977967 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerDied","Data":"4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a"} Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.978003 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ec6012-0694-4f01-a51e-709b0c6999fb","Type":"ContainerDied","Data":"faf38e93593af89e56e66026b738bd73ede546e8723ecb9786e25adf5e07ec1f"} Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.978023 4725 scope.go:117] "RemoveContainer" containerID="e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca" Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.981570 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="e9d1cf00d5958f238b464e2eb2f371e000d949ef3901a3f7ece30337723bea95" 
exitCode=0 Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.981610 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"e9d1cf00d5958f238b464e2eb2f371e000d949ef3901a3f7ece30337723bea95"} Feb 25 11:13:41 crc kubenswrapper[4725]: I0225 11:13:41.997797 4725 scope.go:117] "RemoveContainer" containerID="d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.023640 4725 scope.go:117] "RemoveContainer" containerID="4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.034484 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.046966 4725 scope.go:117] "RemoveContainer" containerID="e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.047822 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.058803 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 11:13:42.059305 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="proxy-httpd" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059328 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="proxy-httpd" Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 11:13:42.059342 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="sg-core" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059349 4725 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="sg-core" Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 11:13:42.059369 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-notification-agent" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059379 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-notification-agent" Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 11:13:42.059404 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-central-agent" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059411 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-central-agent" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059603 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="sg-core" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059617 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-central-agent" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059640 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="ceilometer-notification-agent" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.059658 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" containerName="proxy-httpd" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.061482 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.066140 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.066513 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.071599 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.089199 4725 scope.go:117] "RemoveContainer" containerID="e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca" Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 11:13:42.089820 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca\": container with ID starting with e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca not found: ID does not exist" containerID="e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.089967 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca"} err="failed to get container status \"e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca\": rpc error: code = NotFound desc = could not find container \"e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca\": container with ID starting with e855d5c8b3a12c9c716dd5078955d7d8a6e5f8973ed264610ba2bd2275e9d1ca not found: ID does not exist" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.090072 4725 scope.go:117] "RemoveContainer" containerID="d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172" Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 
11:13:42.090466 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172\": container with ID starting with d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172 not found: ID does not exist" containerID="d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.090510 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172"} err="failed to get container status \"d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172\": rpc error: code = NotFound desc = could not find container \"d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172\": container with ID starting with d0f4b66bfa7d9634b807d1e06fd15a7210dba3ce510a4cba0e1fce541d59a172 not found: ID does not exist" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.090538 4725 scope.go:117] "RemoveContainer" containerID="4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a" Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 11:13:42.090993 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a\": container with ID starting with 4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a not found: ID does not exist" containerID="4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.091087 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a"} err="failed to get container status \"4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a\": rpc 
error: code = NotFound desc = could not find container \"4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a\": container with ID starting with 4837e23dfd07024413cbe3d450b3d820342fb3e7ea4048ac8bebafdd8df5ea4a not found: ID does not exist" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.091598 4725 scope.go:117] "RemoveContainer" containerID="e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb" Feb 25 11:13:42 crc kubenswrapper[4725]: E0225 11:13:42.092169 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb\": container with ID starting with e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb not found: ID does not exist" containerID="e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.092213 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb"} err="failed to get container status \"e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb\": rpc error: code = NotFound desc = could not find container \"e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb\": container with ID starting with e2f60ee718fae0ecfd1f6d98eb169ab3e60fc7767278dc553f0f8b5adcc69dfb not found: ID does not exist" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.092241 4725 scope.go:117] "RemoveContainer" containerID="7caa77cf5b27b9b598253176495f0fa2415fb90743494a0dd02b8750c84c33d8" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.239560 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ndv\" (UniqueName: \"kubernetes.io/projected/1a344b84-2809-4ecd-87eb-2381acb5c9d8-kube-api-access-q7ndv\") pod \"ceilometer-0\" (UID: 
\"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.239864 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-log-httpd\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.239996 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-run-httpd\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.240051 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.240192 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-config-data\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.240292 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.240392 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-scripts\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.342151 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ndv\" (UniqueName: \"kubernetes.io/projected/1a344b84-2809-4ecd-87eb-2381acb5c9d8-kube-api-access-q7ndv\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.342226 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-log-httpd\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.342248 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-run-httpd\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.342267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.342357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-config-data\") pod \"ceilometer-0\" (UID: 
\"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.342392 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.342417 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-scripts\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.343683 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-log-httpd\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.344422 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-run-httpd\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.346347 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.346789 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-scripts\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.346965 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-config-data\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.349610 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.361009 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ndv\" (UniqueName: \"kubernetes.io/projected/1a344b84-2809-4ecd-87eb-2381acb5c9d8-kube-api-access-q7ndv\") pod \"ceilometer-0\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") " pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.383420 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.798989 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fmmml"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.800365 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.815807 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fmmml"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.878146 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.894675 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-87xvm"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.895661 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.908703 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-87xvm"] Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.952940 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac81c472-c14e-4190-a40d-ed4a19e13dd7-operator-scripts\") pod \"nova-api-db-create-fmmml\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.953064 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpjhw\" (UniqueName: \"kubernetes.io/projected/ac81c472-c14e-4190-a40d-ed4a19e13dd7-kube-api-access-cpjhw\") pod \"nova-api-db-create-fmmml\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.990731 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerStarted","Data":"757eeda42b552cbae2dacd6f0ac4c4c44bcd08959a36ef9546590981815b5cbc"} Feb 
25 11:13:42 crc kubenswrapper[4725]: I0225 11:13:42.994557 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"11e1b1cdb4e476cda22a21020fd383eb9bc627ad8cf9f3e9b918adf3b517b8b4"} Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.007417 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-48b7w"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.008695 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.020606 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-725d-account-create-update-kbmpr"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.021759 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.023458 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.030477 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-48b7w"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.055283 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-725d-account-create-update-kbmpr"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.062757 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfwll\" (UniqueName: \"kubernetes.io/projected/97586ed7-2c87-4ebc-946e-56e4fab86e31-kube-api-access-wfwll\") pod \"nova-cell0-db-create-87xvm\" (UID: \"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:43 crc 
kubenswrapper[4725]: I0225 11:13:43.063107 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpjhw\" (UniqueName: \"kubernetes.io/projected/ac81c472-c14e-4190-a40d-ed4a19e13dd7-kube-api-access-cpjhw\") pod \"nova-api-db-create-fmmml\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.063386 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97586ed7-2c87-4ebc-946e-56e4fab86e31-operator-scripts\") pod \"nova-cell0-db-create-87xvm\" (UID: \"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.063550 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac81c472-c14e-4190-a40d-ed4a19e13dd7-operator-scripts\") pod \"nova-api-db-create-fmmml\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.064439 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac81c472-c14e-4190-a40d-ed4a19e13dd7-operator-scripts\") pod \"nova-api-db-create-fmmml\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.086558 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpjhw\" (UniqueName: \"kubernetes.io/projected/ac81c472-c14e-4190-a40d-ed4a19e13dd7-kube-api-access-cpjhw\") pod \"nova-api-db-create-fmmml\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.124203 4725 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.149341 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.166927 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15c22dd9-63a4-44f0-a275-bd8d6415fff1-operator-scripts\") pod \"nova-cell1-db-create-48b7w\" (UID: \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.166979 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zgwm\" (UniqueName: \"kubernetes.io/projected/15c22dd9-63a4-44f0-a275-bd8d6415fff1-kube-api-access-6zgwm\") pod \"nova-cell1-db-create-48b7w\" (UID: \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.167029 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfwll\" (UniqueName: \"kubernetes.io/projected/97586ed7-2c87-4ebc-946e-56e4fab86e31-kube-api-access-wfwll\") pod \"nova-cell0-db-create-87xvm\" (UID: \"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.167127 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b049d6-afa5-49eb-8bef-64de2f0672b5-operator-scripts\") pod \"nova-api-725d-account-create-update-kbmpr\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.167149 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89zd2\" (UniqueName: \"kubernetes.io/projected/e8b049d6-afa5-49eb-8bef-64de2f0672b5-kube-api-access-89zd2\") pod \"nova-api-725d-account-create-update-kbmpr\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.167264 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97586ed7-2c87-4ebc-946e-56e4fab86e31-operator-scripts\") pod \"nova-cell0-db-create-87xvm\" (UID: \"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.168082 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97586ed7-2c87-4ebc-946e-56e4fab86e31-operator-scripts\") pod \"nova-cell0-db-create-87xvm\" (UID: \"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.188582 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.188805 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b330a7b3-8fd7-4db6-8d82-257570b2bd58" containerName="kube-state-metrics" containerID="cri-o://62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de" gracePeriod=30 Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.194044 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfwll\" (UniqueName: \"kubernetes.io/projected/97586ed7-2c87-4ebc-946e-56e4fab86e31-kube-api-access-wfwll\") pod \"nova-cell0-db-create-87xvm\" (UID: 
\"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.219059 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58868cbfd5-pvwdv" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.219151 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5c48-account-create-update-pd8sb"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.221140 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.224532 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.240726 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.282516 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/967eb016-3ed0-4d88-a839-e753c7a6e9a5-operator-scripts\") pod \"nova-cell0-5c48-account-create-update-pd8sb\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.286071 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ec6012-0694-4f01-a51e-709b0c6999fb" path="/var/lib/kubelet/pods/18ec6012-0694-4f01-a51e-709b0c6999fb/volumes" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.287772 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15c22dd9-63a4-44f0-a275-bd8d6415fff1-operator-scripts\") pod \"nova-cell1-db-create-48b7w\" (UID: 
\"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.287819 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wncw4\" (UniqueName: \"kubernetes.io/projected/967eb016-3ed0-4d88-a839-e753c7a6e9a5-kube-api-access-wncw4\") pod \"nova-cell0-5c48-account-create-update-pd8sb\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.287875 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zgwm\" (UniqueName: \"kubernetes.io/projected/15c22dd9-63a4-44f0-a275-bd8d6415fff1-kube-api-access-6zgwm\") pod \"nova-cell1-db-create-48b7w\" (UID: \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.288074 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b049d6-afa5-49eb-8bef-64de2f0672b5-operator-scripts\") pod \"nova-api-725d-account-create-update-kbmpr\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.288099 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89zd2\" (UniqueName: \"kubernetes.io/projected/e8b049d6-afa5-49eb-8bef-64de2f0672b5-kube-api-access-89zd2\") pod \"nova-api-725d-account-create-update-kbmpr\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.289733 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e8b049d6-afa5-49eb-8bef-64de2f0672b5-operator-scripts\") pod \"nova-api-725d-account-create-update-kbmpr\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.289957 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15c22dd9-63a4-44f0-a275-bd8d6415fff1-operator-scripts\") pod \"nova-cell1-db-create-48b7w\" (UID: \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.292988 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5c48-account-create-update-pd8sb"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.321632 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zgwm\" (UniqueName: \"kubernetes.io/projected/15c22dd9-63a4-44f0-a275-bd8d6415fff1-kube-api-access-6zgwm\") pod \"nova-cell1-db-create-48b7w\" (UID: \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.325331 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.331600 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89zd2\" (UniqueName: \"kubernetes.io/projected/e8b049d6-afa5-49eb-8bef-64de2f0672b5-kube-api-access-89zd2\") pod \"nova-api-725d-account-create-update-kbmpr\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.339488 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b9448d47d-2x4vh"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.339925 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b9448d47d-2x4vh" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-api" containerID="cri-o://0e837a9df0516ec462c52022c8a572fd94d7b77ba861d5ed648de97086ec1d9b" gracePeriod=30 Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.340060 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b9448d47d-2x4vh" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-httpd" containerID="cri-o://8ee55fe701e26882854163396a4b7c2ce444570c5c52397ccaddebebdaabb7ba" gracePeriod=30 Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.344177 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.347499 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.349920 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d86f859c9-f94qp" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.393911 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/967eb016-3ed0-4d88-a839-e753c7a6e9a5-operator-scripts\") pod \"nova-cell0-5c48-account-create-update-pd8sb\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.393978 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wncw4\" (UniqueName: \"kubernetes.io/projected/967eb016-3ed0-4d88-a839-e753c7a6e9a5-kube-api-access-wncw4\") pod \"nova-cell0-5c48-account-create-update-pd8sb\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.395048 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/967eb016-3ed0-4d88-a839-e753c7a6e9a5-operator-scripts\") pod \"nova-cell0-5c48-account-create-update-pd8sb\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.420541 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wncw4\" (UniqueName: \"kubernetes.io/projected/967eb016-3ed0-4d88-a839-e753c7a6e9a5-kube-api-access-wncw4\") pod 
\"nova-cell0-5c48-account-create-update-pd8sb\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.427701 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ce2a-account-create-update-4ls5m"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.429440 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.433629 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.440161 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ce2a-account-create-update-4ls5m"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.608000 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-operator-scripts\") pod \"nova-cell1-ce2a-account-create-update-4ls5m\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.608359 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rjr2\" (UniqueName: \"kubernetes.io/projected/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-kube-api-access-4rjr2\") pod \"nova-cell1-ce2a-account-create-update-4ls5m\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.609648 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.710016 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rjr2\" (UniqueName: \"kubernetes.io/projected/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-kube-api-access-4rjr2\") pod \"nova-cell1-ce2a-account-create-update-4ls5m\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.710354 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-operator-scripts\") pod \"nova-cell1-ce2a-account-create-update-4ls5m\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.711263 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-operator-scripts\") pod \"nova-cell1-ce2a-account-create-update-4ls5m\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.730544 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rjr2\" (UniqueName: \"kubernetes.io/projected/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-kube-api-access-4rjr2\") pod \"nova-cell1-ce2a-account-create-update-4ls5m\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.753257 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.798136 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fmmml"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.986949 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-87xvm"] Feb 25 11:13:43 crc kubenswrapper[4725]: I0225 11:13:43.994632 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-48b7w"] Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.033898 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.038531 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-87xvm" event={"ID":"97586ed7-2c87-4ebc-946e-56e4fab86e31","Type":"ContainerStarted","Data":"9f727c3420bcfde486822fe6730d4cebb76891372f0324df33e83d7ff6050914"} Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.045411 4725 generic.go:334] "Generic (PLEG): container finished" podID="b330a7b3-8fd7-4db6-8d82-257570b2bd58" containerID="62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de" exitCode=2 Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.045499 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b330a7b3-8fd7-4db6-8d82-257570b2bd58","Type":"ContainerDied","Data":"62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de"} Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.045526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b330a7b3-8fd7-4db6-8d82-257570b2bd58","Type":"ContainerDied","Data":"3847786803addb9937455ea335a5be3d8ba93568a72f7fb0c15d048abe56b0e5"} Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.045548 4725 
scope.go:117] "RemoveContainer" containerID="62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.045682 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.063394 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fmmml" event={"ID":"ac81c472-c14e-4190-a40d-ed4a19e13dd7","Type":"ContainerStarted","Data":"cc3d4b75378cf9d8a8fc697037261bbe307b342a7b70ee19180ae73a6b9f52d7"} Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.083486 4725 generic.go:334] "Generic (PLEG): container finished" podID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerID="8ee55fe701e26882854163396a4b7c2ce444570c5c52397ccaddebebdaabb7ba" exitCode=0 Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.084879 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9448d47d-2x4vh" event={"ID":"36f15650-4f16-4e3b-94cf-a80bcb7c3fde","Type":"ContainerDied","Data":"8ee55fe701e26882854163396a4b7c2ce444570c5c52397ccaddebebdaabb7ba"} Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.137316 4725 scope.go:117] "RemoveContainer" containerID="62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de" Feb 25 11:13:44 crc kubenswrapper[4725]: E0225 11:13:44.138203 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de\": container with ID starting with 62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de not found: ID does not exist" containerID="62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.138253 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de"} err="failed to get container status \"62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de\": rpc error: code = NotFound desc = could not find container \"62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de\": container with ID starting with 62bcb863db8409836fd857edd713ad961569f463d1e5e73bdb2a4b1ba70f68de not found: ID does not exist" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.222343 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-725d-account-create-update-kbmpr"] Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.227580 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f98hm\" (UniqueName: \"kubernetes.io/projected/b330a7b3-8fd7-4db6-8d82-257570b2bd58-kube-api-access-f98hm\") pod \"b330a7b3-8fd7-4db6-8d82-257570b2bd58\" (UID: \"b330a7b3-8fd7-4db6-8d82-257570b2bd58\") " Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.238969 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b330a7b3-8fd7-4db6-8d82-257570b2bd58-kube-api-access-f98hm" (OuterVolumeSpecName: "kube-api-access-f98hm") pod "b330a7b3-8fd7-4db6-8d82-257570b2bd58" (UID: "b330a7b3-8fd7-4db6-8d82-257570b2bd58"). InnerVolumeSpecName "kube-api-access-f98hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.330553 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f98hm\" (UniqueName: \"kubernetes.io/projected/b330a7b3-8fd7-4db6-8d82-257570b2bd58-kube-api-access-f98hm\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.383904 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ce2a-account-create-update-4ls5m"] Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.397260 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5c48-account-create-update-pd8sb"] Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.434072 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.488501 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.549009 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:13:44 crc kubenswrapper[4725]: E0225 11:13:44.549557 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b330a7b3-8fd7-4db6-8d82-257570b2bd58" containerName="kube-state-metrics" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.549621 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b330a7b3-8fd7-4db6-8d82-257570b2bd58" containerName="kube-state-metrics" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.549884 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b330a7b3-8fd7-4db6-8d82-257570b2bd58" containerName="kube-state-metrics" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.550769 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.556839 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.557169 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.594945 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.739341 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.739644 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.739931 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.740232 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9gx\" (UniqueName: 
\"kubernetes.io/projected/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-api-access-4b9gx\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.846511 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.847234 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9gx\" (UniqueName: \"kubernetes.io/projected/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-api-access-4b9gx\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.847350 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.847395 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.854123 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.854929 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.865109 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9gx\" (UniqueName: \"kubernetes.io/projected/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-api-access-4b9gx\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.874852 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0e72df9-3fcc-4373-b1af-fac9d1bc5e99-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99\") " pod="openstack/kube-state-metrics-0" Feb 25 11:13:44 crc kubenswrapper[4725]: I0225 11:13:44.907547 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.130306 4725 generic.go:334] "Generic (PLEG): container finished" podID="97586ed7-2c87-4ebc-946e-56e4fab86e31" containerID="4f46516fbde8921f0c96488e4649d1fa7b06f5f8cc85d74725a31289b3e9529e" exitCode=0 Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.131038 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-87xvm" event={"ID":"97586ed7-2c87-4ebc-946e-56e4fab86e31","Type":"ContainerDied","Data":"4f46516fbde8921f0c96488e4649d1fa7b06f5f8cc85d74725a31289b3e9529e"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.145305 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-725d-account-create-update-kbmpr" event={"ID":"e8b049d6-afa5-49eb-8bef-64de2f0672b5","Type":"ContainerStarted","Data":"cddf54955eecc79b95cde8526782c391fc3cd16d299108255b1c7ab73d2c671c"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.145339 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-725d-account-create-update-kbmpr" event={"ID":"e8b049d6-afa5-49eb-8bef-64de2f0672b5","Type":"ContainerStarted","Data":"0e9bb43ebc56035e5bfcbfc0f0a52484ed127550cdeb332f141494b107330ee5"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.174486 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-725d-account-create-update-kbmpr" podStartSLOduration=3.17447228 podStartE2EDuration="3.17447228s" podCreationTimestamp="2026-02-25 11:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:45.162310283 +0000 UTC m=+1250.660892318" watchObservedRunningTime="2026-02-25 11:13:45.17447228 +0000 UTC m=+1250.673054305" Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.178285 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" event={"ID":"967eb016-3ed0-4d88-a839-e753c7a6e9a5","Type":"ContainerStarted","Data":"524a8eb709969188a06d2a46537fa1ce142d16e12eb0251ee5ee507047c7cfed"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.178401 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" event={"ID":"967eb016-3ed0-4d88-a839-e753c7a6e9a5","Type":"ContainerStarted","Data":"6bd894ec7e47c52bb624da187f1ea40c5301d8d1b3efc084ba1884fd4b3ae07d"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.182062 4725 generic.go:334] "Generic (PLEG): container finished" podID="ac81c472-c14e-4190-a40d-ed4a19e13dd7" containerID="f3ca06d08bb87840fc4bb186e58f411e36aa95e04d0577939884383a3f0b4967" exitCode=0 Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.182141 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fmmml" event={"ID":"ac81c472-c14e-4190-a40d-ed4a19e13dd7","Type":"ContainerDied","Data":"f3ca06d08bb87840fc4bb186e58f411e36aa95e04d0577939884383a3f0b4967"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.190361 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" event={"ID":"ed75b89a-43a5-4557-b8e2-a8f730bf8e74","Type":"ContainerStarted","Data":"32b877b788a257c358345907585b181f5cf485ba98b9ae4004658a98e3b3c183"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.190400 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" event={"ID":"ed75b89a-43a5-4557-b8e2-a8f730bf8e74","Type":"ContainerStarted","Data":"1acd0e87b8567b13a51b436655bb91d360afb24080e90c619ac272a7af604a0a"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.193694 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerStarted","Data":"3ce19396bf53dc5fb8295664886163a009176ea32c526089be5cada6bd37067b"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.193715 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerStarted","Data":"570f8b61a1f24a7fac63a39c73d07556912e57e847465b9cd79bdf81b7029880"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.195654 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" podStartSLOduration=2.195644628 podStartE2EDuration="2.195644628s" podCreationTimestamp="2026-02-25 11:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:45.194154508 +0000 UTC m=+1250.692736563" watchObservedRunningTime="2026-02-25 11:13:45.195644628 +0000 UTC m=+1250.694226653" Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.200049 4725 generic.go:334] "Generic (PLEG): container finished" podID="15c22dd9-63a4-44f0-a275-bd8d6415fff1" containerID="0893fd2efeac20b904757819bda89379b6915590d061a7499d042f960b8c8660" exitCode=0 Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.200090 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-48b7w" event={"ID":"15c22dd9-63a4-44f0-a275-bd8d6415fff1","Type":"ContainerDied","Data":"0893fd2efeac20b904757819bda89379b6915590d061a7499d042f960b8c8660"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.200116 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-48b7w" event={"ID":"15c22dd9-63a4-44f0-a275-bd8d6415fff1","Type":"ContainerStarted","Data":"74547767ed6cdaca0d33700d4eaeb6bfe1f2fa389b615b29de06c83aafea9964"} Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.213786 4725 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" podStartSLOduration=2.213770385 podStartE2EDuration="2.213770385s" podCreationTimestamp="2026-02-25 11:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:45.208518204 +0000 UTC m=+1250.707100229" watchObservedRunningTime="2026-02-25 11:13:45.213770385 +0000 UTC m=+1250.712352410" Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.253947 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b330a7b3-8fd7-4db6-8d82-257570b2bd58" path="/var/lib/kubelet/pods/b330a7b3-8fd7-4db6-8d82-257570b2bd58/volumes" Feb 25 11:13:45 crc kubenswrapper[4725]: I0225 11:13:45.502359 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.221254 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerStarted","Data":"e651a2feb68184e8326c4d6809ece33693bb0246c3d3ece7737daf08d38e4fca"} Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.223801 4725 generic.go:334] "Generic (PLEG): container finished" podID="e8b049d6-afa5-49eb-8bef-64de2f0672b5" containerID="cddf54955eecc79b95cde8526782c391fc3cd16d299108255b1c7ab73d2c671c" exitCode=0 Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.223907 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-725d-account-create-update-kbmpr" event={"ID":"e8b049d6-afa5-49eb-8bef-64de2f0672b5","Type":"ContainerDied","Data":"cddf54955eecc79b95cde8526782c391fc3cd16d299108255b1c7ab73d2c671c"} Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.229425 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.233730 4725 generic.go:334] "Generic (PLEG): container finished" podID="967eb016-3ed0-4d88-a839-e753c7a6e9a5" containerID="524a8eb709969188a06d2a46537fa1ce142d16e12eb0251ee5ee507047c7cfed" exitCode=0 Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.233793 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" event={"ID":"967eb016-3ed0-4d88-a839-e753c7a6e9a5","Type":"ContainerDied","Data":"524a8eb709969188a06d2a46537fa1ce142d16e12eb0251ee5ee507047c7cfed"} Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.241555 4725 generic.go:334] "Generic (PLEG): container finished" podID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerID="814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3" exitCode=137 Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.241718 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64cd88bfbd-zxddf" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.242187 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cd88bfbd-zxddf" event={"ID":"abad9fb0-482e-4ed1-8bf5-e738ee946358","Type":"ContainerDied","Data":"814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3"} Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.242224 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cd88bfbd-zxddf" event={"ID":"abad9fb0-482e-4ed1-8bf5-e738ee946358","Type":"ContainerDied","Data":"c85eaadb04a5015a0af6f5b45d0cef53dbdbaf4bd7e9981c04a19c8854b66d58"} Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.242246 4725 scope.go:117] "RemoveContainer" containerID="071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.249981 4725 generic.go:334] "Generic (PLEG): container finished" podID="ed75b89a-43a5-4557-b8e2-a8f730bf8e74" containerID="32b877b788a257c358345907585b181f5cf485ba98b9ae4004658a98e3b3c183" exitCode=0 Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.250063 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" event={"ID":"ed75b89a-43a5-4557-b8e2-a8f730bf8e74","Type":"ContainerDied","Data":"32b877b788a257c358345907585b181f5cf485ba98b9ae4004658a98e3b3c183"} Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.259141 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99","Type":"ContainerStarted","Data":"9b5beeef19ce125f510fa04381f59970c379f3b4c3cb87b8ed00dccac7176a94"} Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.384433 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-config-data\") pod \"abad9fb0-482e-4ed1-8bf5-e738ee946358\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.386182 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-combined-ca-bundle\") pod \"abad9fb0-482e-4ed1-8bf5-e738ee946358\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.386253 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-secret-key\") pod \"abad9fb0-482e-4ed1-8bf5-e738ee946358\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.386289 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abad9fb0-482e-4ed1-8bf5-e738ee946358-logs\") pod \"abad9fb0-482e-4ed1-8bf5-e738ee946358\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.386325 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-scripts\") pod \"abad9fb0-482e-4ed1-8bf5-e738ee946358\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.386353 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-tls-certs\") pod \"abad9fb0-482e-4ed1-8bf5-e738ee946358\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.386385 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fhtc\" (UniqueName: \"kubernetes.io/projected/abad9fb0-482e-4ed1-8bf5-e738ee946358-kube-api-access-9fhtc\") pod \"abad9fb0-482e-4ed1-8bf5-e738ee946358\" (UID: \"abad9fb0-482e-4ed1-8bf5-e738ee946358\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.410708 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abad9fb0-482e-4ed1-8bf5-e738ee946358-kube-api-access-9fhtc" (OuterVolumeSpecName: "kube-api-access-9fhtc") pod "abad9fb0-482e-4ed1-8bf5-e738ee946358" (UID: "abad9fb0-482e-4ed1-8bf5-e738ee946358"). InnerVolumeSpecName "kube-api-access-9fhtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.418454 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "abad9fb0-482e-4ed1-8bf5-e738ee946358" (UID: "abad9fb0-482e-4ed1-8bf5-e738ee946358"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.418710 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abad9fb0-482e-4ed1-8bf5-e738ee946358-logs" (OuterVolumeSpecName: "logs") pod "abad9fb0-482e-4ed1-8bf5-e738ee946358" (UID: "abad9fb0-482e-4ed1-8bf5-e738ee946358"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.443937 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-config-data" (OuterVolumeSpecName: "config-data") pod "abad9fb0-482e-4ed1-8bf5-e738ee946358" (UID: "abad9fb0-482e-4ed1-8bf5-e738ee946358"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.447761 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abad9fb0-482e-4ed1-8bf5-e738ee946358" (UID: "abad9fb0-482e-4ed1-8bf5-e738ee946358"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.457045 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-scripts" (OuterVolumeSpecName: "scripts") pod "abad9fb0-482e-4ed1-8bf5-e738ee946358" (UID: "abad9fb0-482e-4ed1-8bf5-e738ee946358"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.466086 4725 scope.go:117] "RemoveContainer" containerID="814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.488436 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fhtc\" (UniqueName: \"kubernetes.io/projected/abad9fb0-482e-4ed1-8bf5-e738ee946358-kube-api-access-9fhtc\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.488467 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.488477 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.488486 4725 
reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.488494 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abad9fb0-482e-4ed1-8bf5-e738ee946358-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.488504 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abad9fb0-482e-4ed1-8bf5-e738ee946358-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.503987 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "abad9fb0-482e-4ed1-8bf5-e738ee946358" (UID: "abad9fb0-482e-4ed1-8bf5-e738ee946358"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.547625 4725 scope.go:117] "RemoveContainer" containerID="071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f" Feb 25 11:13:46 crc kubenswrapper[4725]: E0225 11:13:46.553040 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f\": container with ID starting with 071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f not found: ID does not exist" containerID="071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.553085 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f"} err="failed to get container status \"071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f\": rpc error: code = NotFound desc = could not find container \"071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f\": container with ID starting with 071600420ebc863ea2aa6f1dad41b5bf3a52349faa95ec4613cce36edf14f54f not found: ID does not exist" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.553109 4725 scope.go:117] "RemoveContainer" containerID="814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3" Feb 25 11:13:46 crc kubenswrapper[4725]: E0225 11:13:46.554081 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3\": container with ID starting with 814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3 not found: ID does not exist" containerID="814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.554132 
4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3"} err="failed to get container status \"814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3\": rpc error: code = NotFound desc = could not find container \"814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3\": container with ID starting with 814dac3075f2512d425ea0d04b03c6529465682a297786186677c06a282cd7f3 not found: ID does not exist" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.591105 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/abad9fb0-482e-4ed1-8bf5-e738ee946358-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.668373 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.727903 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64cd88bfbd-zxddf"] Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.737804 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64cd88bfbd-zxddf"] Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.796097 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfwll\" (UniqueName: \"kubernetes.io/projected/97586ed7-2c87-4ebc-946e-56e4fab86e31-kube-api-access-wfwll\") pod \"97586ed7-2c87-4ebc-946e-56e4fab86e31\" (UID: \"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.796135 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97586ed7-2c87-4ebc-946e-56e4fab86e31-operator-scripts\") pod \"97586ed7-2c87-4ebc-946e-56e4fab86e31\" (UID: 
\"97586ed7-2c87-4ebc-946e-56e4fab86e31\") " Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.797444 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97586ed7-2c87-4ebc-946e-56e4fab86e31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97586ed7-2c87-4ebc-946e-56e4fab86e31" (UID: "97586ed7-2c87-4ebc-946e-56e4fab86e31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.804076 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97586ed7-2c87-4ebc-946e-56e4fab86e31-kube-api-access-wfwll" (OuterVolumeSpecName: "kube-api-access-wfwll") pod "97586ed7-2c87-4ebc-946e-56e4fab86e31" (UID: "97586ed7-2c87-4ebc-946e-56e4fab86e31"). InnerVolumeSpecName "kube-api-access-wfwll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.899359 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfwll\" (UniqueName: \"kubernetes.io/projected/97586ed7-2c87-4ebc-946e-56e4fab86e31-kube-api-access-wfwll\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:46 crc kubenswrapper[4725]: I0225 11:13:46.899393 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97586ed7-2c87-4ebc-946e-56e4fab86e31-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.232246 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.237713 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" path="/var/lib/kubelet/pods/abad9fb0-482e-4ed1-8bf5-e738ee946358/volumes" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.248173 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.273554 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c0e72df9-3fcc-4373-b1af-fac9d1bc5e99","Type":"ContainerStarted","Data":"dbb4de5ce6f0e4861d0bf4ff1ebb9e4fccb8e02bad1458e13ed5013241e3db6e"} Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.274286 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.276975 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-48b7w" event={"ID":"15c22dd9-63a4-44f0-a275-bd8d6415fff1","Type":"ContainerDied","Data":"74547767ed6cdaca0d33700d4eaeb6bfe1f2fa389b615b29de06c83aafea9964"} Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.276978 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-48b7w" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.277013 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74547767ed6cdaca0d33700d4eaeb6bfe1f2fa389b615b29de06c83aafea9964" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.278546 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-87xvm" event={"ID":"97586ed7-2c87-4ebc-946e-56e4fab86e31","Type":"ContainerDied","Data":"9f727c3420bcfde486822fe6730d4cebb76891372f0324df33e83d7ff6050914"} Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.278573 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f727c3420bcfde486822fe6730d4cebb76891372f0324df33e83d7ff6050914" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.278634 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-87xvm" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.282586 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fmmml" event={"ID":"ac81c472-c14e-4190-a40d-ed4a19e13dd7","Type":"ContainerDied","Data":"cc3d4b75378cf9d8a8fc697037261bbe307b342a7b70ee19180ae73a6b9f52d7"} Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.282620 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc3d4b75378cf9d8a8fc697037261bbe307b342a7b70ee19180ae73a6b9f52d7" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.282666 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fmmml" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.286071 4725 generic.go:334] "Generic (PLEG): container finished" podID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerID="0e837a9df0516ec462c52022c8a572fd94d7b77ba861d5ed648de97086ec1d9b" exitCode=0 Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.286189 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9448d47d-2x4vh" event={"ID":"36f15650-4f16-4e3b-94cf-a80bcb7c3fde","Type":"ContainerDied","Data":"0e837a9df0516ec462c52022c8a572fd94d7b77ba861d5ed648de97086ec1d9b"} Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.323509 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.798247643 podStartE2EDuration="3.323484584s" podCreationTimestamp="2026-02-25 11:13:44 +0000 UTC" firstStartedPulling="2026-02-25 11:13:45.561743107 +0000 UTC m=+1251.060325122" lastFinishedPulling="2026-02-25 11:13:46.086980038 +0000 UTC m=+1251.585562063" observedRunningTime="2026-02-25 11:13:47.32001827 +0000 UTC m=+1252.818600295" watchObservedRunningTime="2026-02-25 11:13:47.323484584 +0000 UTC m=+1252.822066609" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.412643 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpjhw\" (UniqueName: \"kubernetes.io/projected/ac81c472-c14e-4190-a40d-ed4a19e13dd7-kube-api-access-cpjhw\") pod \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.412678 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zgwm\" (UniqueName: \"kubernetes.io/projected/15c22dd9-63a4-44f0-a275-bd8d6415fff1-kube-api-access-6zgwm\") pod \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\" (UID: \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " Feb 25 11:13:47 
crc kubenswrapper[4725]: I0225 11:13:47.412775 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac81c472-c14e-4190-a40d-ed4a19e13dd7-operator-scripts\") pod \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\" (UID: \"ac81c472-c14e-4190-a40d-ed4a19e13dd7\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.412899 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15c22dd9-63a4-44f0-a275-bd8d6415fff1-operator-scripts\") pod \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\" (UID: \"15c22dd9-63a4-44f0-a275-bd8d6415fff1\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.416474 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15c22dd9-63a4-44f0-a275-bd8d6415fff1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15c22dd9-63a4-44f0-a275-bd8d6415fff1" (UID: "15c22dd9-63a4-44f0-a275-bd8d6415fff1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.417163 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac81c472-c14e-4190-a40d-ed4a19e13dd7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac81c472-c14e-4190-a40d-ed4a19e13dd7" (UID: "ac81c472-c14e-4190-a40d-ed4a19e13dd7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.421223 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac81c472-c14e-4190-a40d-ed4a19e13dd7-kube-api-access-cpjhw" (OuterVolumeSpecName: "kube-api-access-cpjhw") pod "ac81c472-c14e-4190-a40d-ed4a19e13dd7" (UID: "ac81c472-c14e-4190-a40d-ed4a19e13dd7"). 
InnerVolumeSpecName "kube-api-access-cpjhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.426132 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c22dd9-63a4-44f0-a275-bd8d6415fff1-kube-api-access-6zgwm" (OuterVolumeSpecName: "kube-api-access-6zgwm") pod "15c22dd9-63a4-44f0-a275-bd8d6415fff1" (UID: "15c22dd9-63a4-44f0-a275-bd8d6415fff1"). InnerVolumeSpecName "kube-api-access-6zgwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.515990 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zgwm\" (UniqueName: \"kubernetes.io/projected/15c22dd9-63a4-44f0-a275-bd8d6415fff1-kube-api-access-6zgwm\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.516029 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac81c472-c14e-4190-a40d-ed4a19e13dd7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.516038 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15c22dd9-63a4-44f0-a275-bd8d6415fff1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.516045 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpjhw\" (UniqueName: \"kubernetes.io/projected/ac81c472-c14e-4190-a40d-ed4a19e13dd7-kube-api-access-cpjhw\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.715371 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.820537 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-ovndb-tls-certs\") pod \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.820631 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-httpd-config\") pod \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.820673 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l22ml\" (UniqueName: \"kubernetes.io/projected/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-kube-api-access-l22ml\") pod \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.820764 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-combined-ca-bundle\") pod \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.820798 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-config\") pod \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\" (UID: \"36f15650-4f16-4e3b-94cf-a80bcb7c3fde\") " Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.829535 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "36f15650-4f16-4e3b-94cf-a80bcb7c3fde" (UID: "36f15650-4f16-4e3b-94cf-a80bcb7c3fde"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.854123 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-kube-api-access-l22ml" (OuterVolumeSpecName: "kube-api-access-l22ml") pod "36f15650-4f16-4e3b-94cf-a80bcb7c3fde" (UID: "36f15650-4f16-4e3b-94cf-a80bcb7c3fde"). InnerVolumeSpecName "kube-api-access-l22ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.923266 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.923526 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l22ml\" (UniqueName: \"kubernetes.io/projected/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-kube-api-access-l22ml\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.940848 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36f15650-4f16-4e3b-94cf-a80bcb7c3fde" (UID: "36f15650-4f16-4e3b-94cf-a80bcb7c3fde"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.951917 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-config" (OuterVolumeSpecName: "config") pod "36f15650-4f16-4e3b-94cf-a80bcb7c3fde" (UID: "36f15650-4f16-4e3b-94cf-a80bcb7c3fde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:47 crc kubenswrapper[4725]: I0225 11:13:47.966249 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "36f15650-4f16-4e3b-94cf-a80bcb7c3fde" (UID: "36f15650-4f16-4e3b-94cf-a80bcb7c3fde"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.030800 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.031063 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.031073 4725 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f15650-4f16-4e3b-94cf-a80bcb7c3fde-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.034182 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.040602 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.046944 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.132255 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89zd2\" (UniqueName: \"kubernetes.io/projected/e8b049d6-afa5-49eb-8bef-64de2f0672b5-kube-api-access-89zd2\") pod \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.132538 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b049d6-afa5-49eb-8bef-64de2f0672b5-operator-scripts\") pod \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\" (UID: \"e8b049d6-afa5-49eb-8bef-64de2f0672b5\") " Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.133008 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b049d6-afa5-49eb-8bef-64de2f0672b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8b049d6-afa5-49eb-8bef-64de2f0672b5" (UID: "e8b049d6-afa5-49eb-8bef-64de2f0672b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.135873 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b049d6-afa5-49eb-8bef-64de2f0672b5-kube-api-access-89zd2" (OuterVolumeSpecName: "kube-api-access-89zd2") pod "e8b049d6-afa5-49eb-8bef-64de2f0672b5" (UID: "e8b049d6-afa5-49eb-8bef-64de2f0672b5"). InnerVolumeSpecName "kube-api-access-89zd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.237871 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/967eb016-3ed0-4d88-a839-e753c7a6e9a5-operator-scripts\") pod \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.237946 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rjr2\" (UniqueName: \"kubernetes.io/projected/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-kube-api-access-4rjr2\") pod \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.238010 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wncw4\" (UniqueName: \"kubernetes.io/projected/967eb016-3ed0-4d88-a839-e753c7a6e9a5-kube-api-access-wncw4\") pod \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\" (UID: \"967eb016-3ed0-4d88-a839-e753c7a6e9a5\") " Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.238079 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-operator-scripts\") pod \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\" (UID: \"ed75b89a-43a5-4557-b8e2-a8f730bf8e74\") " Feb 25 11:13:48 crc 
kubenswrapper[4725]: I0225 11:13:48.238741 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967eb016-3ed0-4d88-a839-e753c7a6e9a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "967eb016-3ed0-4d88-a839-e753c7a6e9a5" (UID: "967eb016-3ed0-4d88-a839-e753c7a6e9a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.238803 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed75b89a-43a5-4557-b8e2-a8f730bf8e74" (UID: "ed75b89a-43a5-4557-b8e2-a8f730bf8e74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.241854 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/967eb016-3ed0-4d88-a839-e753c7a6e9a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.241896 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b049d6-afa5-49eb-8bef-64de2f0672b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.241907 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89zd2\" (UniqueName: \"kubernetes.io/projected/e8b049d6-afa5-49eb-8bef-64de2f0672b5-kube-api-access-89zd2\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.241920 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc 
kubenswrapper[4725]: I0225 11:13:48.244628 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-kube-api-access-4rjr2" (OuterVolumeSpecName: "kube-api-access-4rjr2") pod "ed75b89a-43a5-4557-b8e2-a8f730bf8e74" (UID: "ed75b89a-43a5-4557-b8e2-a8f730bf8e74"). InnerVolumeSpecName "kube-api-access-4rjr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.246964 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967eb016-3ed0-4d88-a839-e753c7a6e9a5-kube-api-access-wncw4" (OuterVolumeSpecName: "kube-api-access-wncw4") pod "967eb016-3ed0-4d88-a839-e753c7a6e9a5" (UID: "967eb016-3ed0-4d88-a839-e753c7a6e9a5"). InnerVolumeSpecName "kube-api-access-wncw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.293949 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" event={"ID":"ed75b89a-43a5-4557-b8e2-a8f730bf8e74","Type":"ContainerDied","Data":"1acd0e87b8567b13a51b436655bb91d360afb24080e90c619ac272a7af604a0a"} Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.294184 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1acd0e87b8567b13a51b436655bb91d360afb24080e90c619ac272a7af604a0a" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.294235 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ce2a-account-create-update-4ls5m" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.296070 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerStarted","Data":"f5415ec61438ca5918f8c9f837f6055ce3ef43a8a45d133827259a014c19acdf"} Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.296206 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.296211 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-central-agent" containerID="cri-o://570f8b61a1f24a7fac63a39c73d07556912e57e847465b9cd79bdf81b7029880" gracePeriod=30 Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.296335 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="sg-core" containerID="cri-o://e651a2feb68184e8326c4d6809ece33693bb0246c3d3ece7737daf08d38e4fca" gracePeriod=30 Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.296393 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-notification-agent" containerID="cri-o://3ce19396bf53dc5fb8295664886163a009176ea32c526089be5cada6bd37067b" gracePeriod=30 Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.296473 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="proxy-httpd" containerID="cri-o://f5415ec61438ca5918f8c9f837f6055ce3ef43a8a45d133827259a014c19acdf" gracePeriod=30 Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 
11:13:48.304708 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-725d-account-create-update-kbmpr" event={"ID":"e8b049d6-afa5-49eb-8bef-64de2f0672b5","Type":"ContainerDied","Data":"0e9bb43ebc56035e5bfcbfc0f0a52484ed127550cdeb332f141494b107330ee5"} Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.304746 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e9bb43ebc56035e5bfcbfc0f0a52484ed127550cdeb332f141494b107330ee5" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.304808 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-725d-account-create-update-kbmpr" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.307624 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" event={"ID":"967eb016-3ed0-4d88-a839-e753c7a6e9a5","Type":"ContainerDied","Data":"6bd894ec7e47c52bb624da187f1ea40c5301d8d1b3efc084ba1884fd4b3ae07d"} Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.307675 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bd894ec7e47c52bb624da187f1ea40c5301d8d1b3efc084ba1884fd4b3ae07d" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.307637 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5c48-account-create-update-pd8sb" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.311791 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9448d47d-2x4vh" event={"ID":"36f15650-4f16-4e3b-94cf-a80bcb7c3fde","Type":"ContainerDied","Data":"1f92df5cf8b21ffedbbbc0812b584252f3ff988ffe144b32fea635a3f909cf8a"} Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.311853 4725 scope.go:117] "RemoveContainer" containerID="8ee55fe701e26882854163396a4b7c2ce444570c5c52397ccaddebebdaabb7ba" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.311957 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b9448d47d-2x4vh" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.324469 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.852686675 podStartE2EDuration="6.324453047s" podCreationTimestamp="2026-02-25 11:13:42 +0000 UTC" firstStartedPulling="2026-02-25 11:13:42.884171413 +0000 UTC m=+1248.382753428" lastFinishedPulling="2026-02-25 11:13:47.355937775 +0000 UTC m=+1252.854519800" observedRunningTime="2026-02-25 11:13:48.32086572 +0000 UTC m=+1253.819447745" watchObservedRunningTime="2026-02-25 11:13:48.324453047 +0000 UTC m=+1253.823035072" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.343791 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rjr2\" (UniqueName: \"kubernetes.io/projected/ed75b89a-43a5-4557-b8e2-a8f730bf8e74-kube-api-access-4rjr2\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.343818 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wncw4\" (UniqueName: \"kubernetes.io/projected/967eb016-3ed0-4d88-a839-e753c7a6e9a5-kube-api-access-wncw4\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.491301 4725 
scope.go:117] "RemoveContainer" containerID="0e837a9df0516ec462c52022c8a572fd94d7b77ba861d5ed648de97086ec1d9b" Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.506011 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b9448d47d-2x4vh"] Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.513475 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b9448d47d-2x4vh"] Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.834189 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.834619 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerName="glance-log" containerID="cri-o://607377565ac3041c8ebf6cae37de619f939247e3524af84a69e1df7982db5a95" gracePeriod=30 Feb 25 11:13:48 crc kubenswrapper[4725]: I0225 11:13:48.834698 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerName="glance-httpd" containerID="cri-o://256749a73a6e2107d4f6e5e9d37f972c00d438e33c8c8460bfe4fbcc9346b834" gracePeriod=30 Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.244889 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" path="/var/lib/kubelet/pods/36f15650-4f16-4e3b-94cf-a80bcb7c3fde/volumes" Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.322784 4725 generic.go:334] "Generic (PLEG): container finished" podID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerID="f5415ec61438ca5918f8c9f837f6055ce3ef43a8a45d133827259a014c19acdf" exitCode=0 Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.322845 4725 generic.go:334] "Generic (PLEG): container finished" podID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" 
containerID="e651a2feb68184e8326c4d6809ece33693bb0246c3d3ece7737daf08d38e4fca" exitCode=2 Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.322856 4725 generic.go:334] "Generic (PLEG): container finished" podID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerID="3ce19396bf53dc5fb8295664886163a009176ea32c526089be5cada6bd37067b" exitCode=0 Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.322859 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerDied","Data":"f5415ec61438ca5918f8c9f837f6055ce3ef43a8a45d133827259a014c19acdf"} Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.322903 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerDied","Data":"e651a2feb68184e8326c4d6809ece33693bb0246c3d3ece7737daf08d38e4fca"} Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.322918 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerDied","Data":"3ce19396bf53dc5fb8295664886163a009176ea32c526089be5cada6bd37067b"} Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.325777 4725 generic.go:334] "Generic (PLEG): container finished" podID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerID="607377565ac3041c8ebf6cae37de619f939247e3524af84a69e1df7982db5a95" exitCode=143 Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.325864 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf04584d-e28f-4010-91c0-0dafe5dde54c","Type":"ContainerDied","Data":"607377565ac3041c8ebf6cae37de619f939247e3524af84a69e1df7982db5a95"} Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.651528 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:13:49 crc 
kubenswrapper[4725]: I0225 11:13:49.651855 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-log" containerID="cri-o://8b367172e8919f938670f03a6303378703dfbba29b2de04882da1c7955816207" gracePeriod=30 Feb 25 11:13:49 crc kubenswrapper[4725]: I0225 11:13:49.651942 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-httpd" containerID="cri-o://01fe3b1ee2f8aa8ca4385d279b32ba554348f15c838f6ba17a89bae0bc2fb4a5" gracePeriod=30 Feb 25 11:13:50 crc kubenswrapper[4725]: I0225 11:13:50.338330 4725 generic.go:334] "Generic (PLEG): container finished" podID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerID="8b367172e8919f938670f03a6303378703dfbba29b2de04882da1c7955816207" exitCode=143 Feb 25 11:13:50 crc kubenswrapper[4725]: I0225 11:13:50.338507 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91566ab6-1ac2-4b2b-b705-c049b68e1ab1","Type":"ContainerDied","Data":"8b367172e8919f938670f03a6303378703dfbba29b2de04882da1c7955816207"} Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.366115 4725 generic.go:334] "Generic (PLEG): container finished" podID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerID="256749a73a6e2107d4f6e5e9d37f972c00d438e33c8c8460bfe4fbcc9346b834" exitCode=0 Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.366811 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf04584d-e28f-4010-91c0-0dafe5dde54c","Type":"ContainerDied","Data":"256749a73a6e2107d4f6e5e9d37f972c00d438e33c8c8460bfe4fbcc9346b834"} Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.584104 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.733666 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-httpd-run\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.733775 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-logs\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.733901 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-public-tls-certs\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.733960 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-config-data\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.734011 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ntwj\" (UniqueName: \"kubernetes.io/projected/bf04584d-e28f-4010-91c0-0dafe5dde54c-kube-api-access-8ntwj\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.734079 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.734125 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-scripts\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.734153 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-combined-ca-bundle\") pod \"bf04584d-e28f-4010-91c0-0dafe5dde54c\" (UID: \"bf04584d-e28f-4010-91c0-0dafe5dde54c\") " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.734297 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.734602 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.735215 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-logs" (OuterVolumeSpecName: "logs") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.739871 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf04584d-e28f-4010-91c0-0dafe5dde54c-kube-api-access-8ntwj" (OuterVolumeSpecName: "kube-api-access-8ntwj") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "kube-api-access-8ntwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.741397 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-scripts" (OuterVolumeSpecName: "scripts") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.742513 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.790449 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.790728 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-config-data" (OuterVolumeSpecName: "config-data") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.805246 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bf04584d-e28f-4010-91c0-0dafe5dde54c" (UID: "bf04584d-e28f-4010-91c0-0dafe5dde54c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.836999 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf04584d-e28f-4010-91c0-0dafe5dde54c-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.837045 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.837060 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.837074 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ntwj\" (UniqueName: \"kubernetes.io/projected/bf04584d-e28f-4010-91c0-0dafe5dde54c-kube-api-access-8ntwj\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:52 crc 
kubenswrapper[4725]: I0225 11:13:52.837110 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.837122 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.837136 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf04584d-e28f-4010-91c0-0dafe5dde54c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.859626 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 25 11:13:52 crc kubenswrapper[4725]: I0225 11:13:52.938403 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.067007 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91566ab6_1ac2_4b2b_b705_c049b68e1ab1.slice/crio-01fe3b1ee2f8aa8ca4385d279b32ba554348f15c838f6ba17a89bae0bc2fb4a5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91566ab6_1ac2_4b2b_b705_c049b68e1ab1.slice/crio-conmon-01fe3b1ee2f8aa8ca4385d279b32ba554348f15c838f6ba17a89bae0bc2fb4a5.scope\": RecentStats: unable to find data in memory cache]" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.398813 4725 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf04584d-e28f-4010-91c0-0dafe5dde54c","Type":"ContainerDied","Data":"147062aa4bb8bf51b9bbd7b380a54cc4ab79525d6c429e39bf34f18b5c3d35f2"} Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.399171 4725 scope.go:117] "RemoveContainer" containerID="256749a73a6e2107d4f6e5e9d37f972c00d438e33c8c8460bfe4fbcc9346b834" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.398849 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.414918 4725 generic.go:334] "Generic (PLEG): container finished" podID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerID="01fe3b1ee2f8aa8ca4385d279b32ba554348f15c838f6ba17a89bae0bc2fb4a5" exitCode=0 Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.414990 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91566ab6-1ac2-4b2b-b705-c049b68e1ab1","Type":"ContainerDied","Data":"01fe3b1ee2f8aa8ca4385d279b32ba554348f15c838f6ba17a89bae0bc2fb4a5"} Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.415021 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91566ab6-1ac2-4b2b-b705-c049b68e1ab1","Type":"ContainerDied","Data":"0589e5bcc2fe24c3ccea0736d09bbea4dac750f397d73ccefb8533fa881437e4"} Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.415034 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0589e5bcc2fe24c3ccea0736d09bbea4dac750f397d73ccefb8533fa881437e4" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.422118 4725 generic.go:334] "Generic (PLEG): container finished" podID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerID="570f8b61a1f24a7fac63a39c73d07556912e57e847465b9cd79bdf81b7029880" exitCode=0 Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.422164 
4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerDied","Data":"570f8b61a1f24a7fac63a39c73d07556912e57e847465b9cd79bdf81b7029880"} Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.462952 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grc9g"] Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463293 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-httpd" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463308 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-httpd" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463319 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed75b89a-43a5-4557-b8e2-a8f730bf8e74" containerName="mariadb-account-create-update" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463326 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed75b89a-43a5-4557-b8e2-a8f730bf8e74" containerName="mariadb-account-create-update" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463348 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97586ed7-2c87-4ebc-946e-56e4fab86e31" containerName="mariadb-database-create" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463354 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="97586ed7-2c87-4ebc-946e-56e4fab86e31" containerName="mariadb-database-create" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463365 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerName="glance-log" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463371 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" 
containerName="glance-log" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463378 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-api" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463385 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-api" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463397 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b049d6-afa5-49eb-8bef-64de2f0672b5" containerName="mariadb-account-create-update" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463404 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b049d6-afa5-49eb-8bef-64de2f0672b5" containerName="mariadb-account-create-update" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463414 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon-log" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463420 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon-log" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463432 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463437 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon" Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463448 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac81c472-c14e-4190-a40d-ed4a19e13dd7" containerName="mariadb-database-create" Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463453 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac81c472-c14e-4190-a40d-ed4a19e13dd7" 
containerName="mariadb-database-create"
Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463465 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerName="glance-httpd"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463470 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerName="glance-httpd"
Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463480 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="967eb016-3ed0-4d88-a839-e753c7a6e9a5" containerName="mariadb-account-create-update"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463486 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="967eb016-3ed0-4d88-a839-e753c7a6e9a5" containerName="mariadb-account-create-update"
Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.463495 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c22dd9-63a4-44f0-a275-bd8d6415fff1" containerName="mariadb-database-create"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463501 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c22dd9-63a4-44f0-a275-bd8d6415fff1" containerName="mariadb-database-create"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463645 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b049d6-afa5-49eb-8bef-64de2f0672b5" containerName="mariadb-account-create-update"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463655 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463671 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-httpd"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463681 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="97586ed7-2c87-4ebc-946e-56e4fab86e31" containerName="mariadb-database-create"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463690 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac81c472-c14e-4190-a40d-ed4a19e13dd7" containerName="mariadb-database-create"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463700 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="967eb016-3ed0-4d88-a839-e753c7a6e9a5" containerName="mariadb-account-create-update"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463712 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="abad9fb0-482e-4ed1-8bf5-e738ee946358" containerName="horizon-log"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463722 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f15650-4f16-4e3b-94cf-a80bcb7c3fde" containerName="neutron-api"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463731 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed75b89a-43a5-4557-b8e2-a8f730bf8e74" containerName="mariadb-account-create-update"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463742 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerName="glance-httpd"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463750 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" containerName="glance-log"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.463756 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c22dd9-63a4-44f0-a275-bd8d6415fff1" containerName="mariadb-database-create"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.464304 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.471143 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.471517 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rp7gg"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.471744 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.473386 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grc9g"]
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.478127 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.503626 4725 scope.go:117] "RemoveContainer" containerID="607377565ac3041c8ebf6cae37de619f939247e3524af84a69e1df7982db5a95"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.547964 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.568456 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.589494 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.589887 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-httpd"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.589901 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-httpd"
Feb 25 11:13:53 crc kubenswrapper[4725]: E0225 11:13:53.589917 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-log"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.589925 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-log"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.590082 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-log"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.590094 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" containerName="glance-httpd"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.597139 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.597273 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.601980 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.602159 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.656748 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-internal-tls-certs\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.656909 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-config-data\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.656937 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-httpd-run\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.656998 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-logs\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657020 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-combined-ca-bundle\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657064 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657148 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brqpt\" (UniqueName: \"kubernetes.io/projected/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-kube-api-access-brqpt\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657245 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-scripts\") pod \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\" (UID: \"91566ab6-1ac2-4b2b-b705-c049b68e1ab1\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657588 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-scripts\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657640 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-config-data\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657783 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.657816 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdwll\" (UniqueName: \"kubernetes.io/projected/757bb635-edf2-4081-a9a1-fdc66588e0aa-kube-api-access-cdwll\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.658161 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-logs" (OuterVolumeSpecName: "logs") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.663635 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.663648 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.664869 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-kube-api-access-brqpt" (OuterVolumeSpecName: "kube-api-access-brqpt") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "kube-api-access-brqpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.666600 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-scripts" (OuterVolumeSpecName: "scripts") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.698966 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.737773 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.744450 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.744505 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-config-data" (OuterVolumeSpecName: "config-data") pod "91566ab6-1ac2-4b2b-b705-c049b68e1ab1" (UID: "91566ab6-1ac2-4b2b-b705-c049b68e1ab1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.759396 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-logs\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762135 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762221 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762347 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762438 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c4nd\" (UniqueName: \"kubernetes.io/projected/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-kube-api-access-5c4nd\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762489 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762537 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdwll\" (UniqueName: \"kubernetes.io/projected/757bb635-edf2-4081-a9a1-fdc66588e0aa-kube-api-access-cdwll\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762665 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762691 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762745 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-scripts\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762778 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-config-data\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762804 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762908 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-logs\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762926 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762946 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762956 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brqpt\" (UniqueName: \"kubernetes.io/projected/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-kube-api-access-brqpt\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762965 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762975 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762984 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.762992 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91566ab6-1ac2-4b2b-b705-c049b68e1ab1-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.769763 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.770182 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-scripts\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.774344 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-config-data\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.801390 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdwll\" (UniqueName: \"kubernetes.io/projected/757bb635-edf2-4081-a9a1-fdc66588e0aa-kube-api-access-cdwll\") pod \"nova-cell0-conductor-db-sync-grc9g\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.811136 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.811741 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.864491 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-run-httpd\") pod \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.864585 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7ndv\" (UniqueName: \"kubernetes.io/projected/1a344b84-2809-4ecd-87eb-2381acb5c9d8-kube-api-access-q7ndv\") pod \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.864629 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-config-data\") pod \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.864665 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-scripts\") pod \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.864749 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-sg-core-conf-yaml\") pod \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.864773 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-log-httpd\") pod \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.864814 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-combined-ca-bundle\") pod \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\" (UID: \"1a344b84-2809-4ecd-87eb-2381acb5c9d8\") "
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865018 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865065 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865126 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-logs\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865150 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865168 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865205 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865237 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c4nd\" (UniqueName: \"kubernetes.io/projected/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-kube-api-access-5c4nd\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865298 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.865339 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.866460 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.869353 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a344b84-2809-4ecd-87eb-2381acb5c9d8" (UID: "1a344b84-2809-4ecd-87eb-2381acb5c9d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.872690 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.872744 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a344b84-2809-4ecd-87eb-2381acb5c9d8" (UID: "1a344b84-2809-4ecd-87eb-2381acb5c9d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.873028 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.873253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.873287 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-logs\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.874839 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.880024 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-scripts" (OuterVolumeSpecName: "scripts") pod "1a344b84-2809-4ecd-87eb-2381acb5c9d8" (UID: "1a344b84-2809-4ecd-87eb-2381acb5c9d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.882217 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a344b84-2809-4ecd-87eb-2381acb5c9d8-kube-api-access-q7ndv" (OuterVolumeSpecName: "kube-api-access-q7ndv") pod "1a344b84-2809-4ecd-87eb-2381acb5c9d8" (UID: "1a344b84-2809-4ecd-87eb-2381acb5c9d8"). InnerVolumeSpecName "kube-api-access-q7ndv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.884935 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.891448 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4nd\" (UniqueName: \"kubernetes.io/projected/c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e-kube-api-access-5c4nd\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.908518 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a344b84-2809-4ecd-87eb-2381acb5c9d8" (UID: "1a344b84-2809-4ecd-87eb-2381acb5c9d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.936457 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e\") " pod="openstack/glance-default-external-api-0"
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.968271 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.968521 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.968533 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.968544 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a344b84-2809-4ecd-87eb-2381acb5c9d8-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:53 crc kubenswrapper[4725]: I0225 11:13:53.968558 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7ndv\" (UniqueName: \"kubernetes.io/projected/1a344b84-2809-4ecd-87eb-2381acb5c9d8-kube-api-access-q7ndv\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.017784 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a344b84-2809-4ecd-87eb-2381acb5c9d8" (UID: "1a344b84-2809-4ecd-87eb-2381acb5c9d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.058470 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-config-data" (OuterVolumeSpecName: "config-data") pod "1a344b84-2809-4ecd-87eb-2381acb5c9d8" (UID: "1a344b84-2809-4ecd-87eb-2381acb5c9d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.069903 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.069937 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a344b84-2809-4ecd-87eb-2381acb5c9d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.139429 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grc9g"]
Feb 25 11:13:54 crc kubenswrapper[4725]: W0225 11:13:54.153051 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod757bb635_edf2_4081_a9a1_fdc66588e0aa.slice/crio-e9a30accf103bf929950845481713438297bb742f98a6200849e84ce14602eb5 WatchSource:0}: Error finding container e9a30accf103bf929950845481713438297bb742f98a6200849e84ce14602eb5: Status 404 returned error can't find the container with id e9a30accf103bf929950845481713438297bb742f98a6200849e84ce14602eb5
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.214475 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.439345 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grc9g" event={"ID":"757bb635-edf2-4081-a9a1-fdc66588e0aa","Type":"ContainerStarted","Data":"e9a30accf103bf929950845481713438297bb742f98a6200849e84ce14602eb5"}
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.446330 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.446387 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.446443 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a344b84-2809-4ecd-87eb-2381acb5c9d8","Type":"ContainerDied","Data":"757eeda42b552cbae2dacd6f0ac4c4c44bcd08959a36ef9546590981815b5cbc"}
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.446479 4725 scope.go:117] "RemoveContainer" containerID="f5415ec61438ca5918f8c9f837f6055ce3ef43a8a45d133827259a014c19acdf"
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.499911 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.511245 4725 scope.go:117] "RemoveContainer" containerID="e651a2feb68184e8326c4d6809ece33693bb0246c3d3ece7737daf08d38e4fca"
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.512074 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.531235 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535179 4725 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:13:54 crc kubenswrapper[4725]: E0225 11:13:54.535548 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="sg-core" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535563 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="sg-core" Feb 25 11:13:54 crc kubenswrapper[4725]: E0225 11:13:54.535574 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="proxy-httpd" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535581 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="proxy-httpd" Feb 25 11:13:54 crc kubenswrapper[4725]: E0225 11:13:54.535604 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-notification-agent" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535612 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-notification-agent" Feb 25 11:13:54 crc kubenswrapper[4725]: E0225 11:13:54.535623 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-central-agent" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535628 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-central-agent" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535777 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="proxy-httpd" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535795 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-central-agent" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535806 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="ceilometer-notification-agent" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.535815 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" containerName="sg-core" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.536707 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.539399 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.539625 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.551774 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.558521 4725 scope.go:117] "RemoveContainer" containerID="3ce19396bf53dc5fb8295664886163a009176ea32c526089be5cada6bd37067b" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.564042 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.573920 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.576539 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.580871 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.581012 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.581103 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.590785 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.596816 4725 scope.go:117] "RemoveContainer" containerID="570f8b61a1f24a7fac63a39c73d07556912e57e847465b9cd79bdf81b7029880" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.679992 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-scripts\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680031 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-log-httpd\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680060 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " 
pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680080 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82vdm\" (UniqueName: \"kubernetes.io/projected/e443ecd8-7a45-4762-866f-f682b221da44-kube-api-access-82vdm\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680106 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680126 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-run-httpd\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680151 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680180 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc 
kubenswrapper[4725]: I0225 11:13:54.680222 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-config-data\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680240 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/993eb2eb-155b-419e-85a7-c59a25492dda-logs\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680279 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680319 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680358 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680383 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/993eb2eb-155b-419e-85a7-c59a25492dda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680410 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.680435 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msz9x\" (UniqueName: \"kubernetes.io/projected/993eb2eb-155b-419e-85a7-c59a25492dda-kube-api-access-msz9x\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.770231 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 11:13:54 crc kubenswrapper[4725]: W0225 11:13:54.776174 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e3ea4a_8acb_4eee_a051_82ef6d7dad0e.slice/crio-b637b3aa5f99f931893a723274b490cd8b21e5aa132e0928923ab10a2836f37d WatchSource:0}: Error finding container b637b3aa5f99f931893a723274b490cd8b21e5aa132e0928923ab10a2836f37d: Status 404 returned error can't find the container with id b637b3aa5f99f931893a723274b490cd8b21e5aa132e0928923ab10a2836f37d Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.782121 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-run-httpd\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.782633 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-run-httpd\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.782700 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.783601 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.783708 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-config-data\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.783734 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/993eb2eb-155b-419e-85a7-c59a25492dda-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.783797 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.783876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.783944 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.783987 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/993eb2eb-155b-419e-85a7-c59a25492dda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784022 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 
11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784047 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msz9x\" (UniqueName: \"kubernetes.io/projected/993eb2eb-155b-419e-85a7-c59a25492dda-kube-api-access-msz9x\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784127 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-scripts\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784157 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-log-httpd\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784222 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82vdm\" (UniqueName: \"kubernetes.io/projected/e443ecd8-7a45-4762-866f-f682b221da44-kube-api-access-82vdm\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.784260 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/993eb2eb-155b-419e-85a7-c59a25492dda-logs\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.785171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/993eb2eb-155b-419e-85a7-c59a25492dda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.785387 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.786615 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-log-httpd\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.789911 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") 
" pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.791524 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-config-data\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.796891 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.797232 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.797491 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.797863 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-scripts\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.799082 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.807283 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.810878 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993eb2eb-155b-419e-85a7-c59a25492dda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.814733 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82vdm\" (UniqueName: \"kubernetes.io/projected/e443ecd8-7a45-4762-866f-f682b221da44-kube-api-access-82vdm\") pod \"ceilometer-0\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.817671 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msz9x\" (UniqueName: \"kubernetes.io/projected/993eb2eb-155b-419e-85a7-c59a25492dda-kube-api-access-msz9x\") pod \"glance-default-internal-api-0\" (UID: \"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.830854 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"993eb2eb-155b-419e-85a7-c59a25492dda\") " pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.870710 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.909632 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:13:54 crc kubenswrapper[4725]: I0225 11:13:54.993625 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.235066 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a344b84-2809-4ecd-87eb-2381acb5c9d8" path="/var/lib/kubelet/pods/1a344b84-2809-4ecd-87eb-2381acb5c9d8/volumes" Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.237269 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91566ab6-1ac2-4b2b-b705-c049b68e1ab1" path="/var/lib/kubelet/pods/91566ab6-1ac2-4b2b-b705-c049b68e1ab1/volumes" Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.271114 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf04584d-e28f-4010-91c0-0dafe5dde54c" path="/var/lib/kubelet/pods/bf04584d-e28f-4010-91c0-0dafe5dde54c/volumes" Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.396237 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 11:13:55 crc kubenswrapper[4725]: W0225 11:13:55.400614 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod993eb2eb_155b_419e_85a7_c59a25492dda.slice/crio-500df125002a8c00e3e14187e16eb532893345f2b1f1f86062bde28fa6634175 WatchSource:0}: Error finding container 500df125002a8c00e3e14187e16eb532893345f2b1f1f86062bde28fa6634175: Status 404 returned error can't find the container 
with id 500df125002a8c00e3e14187e16eb532893345f2b1f1f86062bde28fa6634175 Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.476130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"993eb2eb-155b-419e-85a7-c59a25492dda","Type":"ContainerStarted","Data":"500df125002a8c00e3e14187e16eb532893345f2b1f1f86062bde28fa6634175"} Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.481686 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e","Type":"ContainerStarted","Data":"0897a42bbf79c7a626fac66bd08d4d412cd9202fd6d9e32e2c28c12a6bd7e62d"} Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.481735 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e","Type":"ContainerStarted","Data":"b637b3aa5f99f931893a723274b490cd8b21e5aa132e0928923ab10a2836f37d"} Feb 25 11:13:55 crc kubenswrapper[4725]: I0225 11:13:55.499183 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:56 crc kubenswrapper[4725]: I0225 11:13:56.505290 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerStarted","Data":"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0"} Feb 25 11:13:56 crc kubenswrapper[4725]: I0225 11:13:56.505645 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerStarted","Data":"0525d3a5623bfc7a4267d77d15949a471e41a478a9eb07da640dcad1dcb872f5"} Feb 25 11:13:56 crc kubenswrapper[4725]: I0225 11:13:56.508209 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"993eb2eb-155b-419e-85a7-c59a25492dda","Type":"ContainerStarted","Data":"11e621727efcdf534aeef41648815da4df065d49ed0f6caa7699591d159ea223"} Feb 25 11:13:56 crc kubenswrapper[4725]: I0225 11:13:56.509593 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e","Type":"ContainerStarted","Data":"3c64affaca7c58e8f0c15bf4d822b14d04df5f6c84d470746c14cf7e4bef0e91"} Feb 25 11:13:56 crc kubenswrapper[4725]: I0225 11:13:56.534336 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.534319095 podStartE2EDuration="3.534319095s" podCreationTimestamp="2026-02-25 11:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:56.527437221 +0000 UTC m=+1262.026019246" watchObservedRunningTime="2026-02-25 11:13:56.534319095 +0000 UTC m=+1262.032901110" Feb 25 11:13:56 crc kubenswrapper[4725]: I0225 11:13:56.852126 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:13:57 crc kubenswrapper[4725]: I0225 11:13:57.522625 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"993eb2eb-155b-419e-85a7-c59a25492dda","Type":"ContainerStarted","Data":"d92c2ba9d435fdd42e1d7ee1ae4bc6e997a0bf1987c679519ec4efefe092e61a"} Feb 25 11:13:57 crc kubenswrapper[4725]: I0225 11:13:57.527378 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerStarted","Data":"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f"} Feb 25 11:13:57 crc kubenswrapper[4725]: I0225 11:13:57.527429 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerStarted","Data":"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7"} Feb 25 11:13:57 crc kubenswrapper[4725]: I0225 11:13:57.549591 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.549570571 podStartE2EDuration="3.549570571s" podCreationTimestamp="2026-02-25 11:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:13:57.54096014 +0000 UTC m=+1263.039542185" watchObservedRunningTime="2026-02-25 11:13:57.549570571 +0000 UTC m=+1263.048152596" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.139653 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533634-rq22r"] Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.141564 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533634-rq22r" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.143713 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.144084 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.144132 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.166793 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533634-rq22r"] Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.301915 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnsw\" (UniqueName: \"kubernetes.io/projected/532209c0-1111-4779-9620-732c8d611e1c-kube-api-access-vwnsw\") pod \"auto-csr-approver-29533634-rq22r\" (UID: \"532209c0-1111-4779-9620-732c8d611e1c\") " pod="openshift-infra/auto-csr-approver-29533634-rq22r" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.403794 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnsw\" (UniqueName: \"kubernetes.io/projected/532209c0-1111-4779-9620-732c8d611e1c-kube-api-access-vwnsw\") pod \"auto-csr-approver-29533634-rq22r\" (UID: \"532209c0-1111-4779-9620-732c8d611e1c\") " pod="openshift-infra/auto-csr-approver-29533634-rq22r" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.422554 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnsw\" (UniqueName: \"kubernetes.io/projected/532209c0-1111-4779-9620-732c8d611e1c-kube-api-access-vwnsw\") pod \"auto-csr-approver-29533634-rq22r\" (UID: \"532209c0-1111-4779-9620-732c8d611e1c\") " 
pod="openshift-infra/auto-csr-approver-29533634-rq22r" Feb 25 11:14:00 crc kubenswrapper[4725]: I0225 11:14:00.473968 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533634-rq22r" Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.110271 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533634-rq22r"] Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.587254 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-central-agent" containerID="cri-o://eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" gracePeriod=30 Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.587383 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="proxy-httpd" containerID="cri-o://e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" gracePeriod=30 Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.587410 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="sg-core" containerID="cri-o://d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" gracePeriod=30 Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.587207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerStarted","Data":"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a"} Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.587724 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.587463 4725 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-notification-agent" containerID="cri-o://726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" gracePeriod=30 Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.595749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grc9g" event={"ID":"757bb635-edf2-4081-a9a1-fdc66588e0aa","Type":"ContainerStarted","Data":"c9bf8f113504797681a6e783d046c1d61dae84c398e454734223e6fd8ce114b7"} Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.602138 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533634-rq22r" event={"ID":"532209c0-1111-4779-9620-732c8d611e1c","Type":"ContainerStarted","Data":"4daff68be7e5b6ac625bdbdaa205cdc527f1d19c31c627cebd5599ccd44a07dc"} Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.633101 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.465519618 podStartE2EDuration="9.633075915s" podCreationTimestamp="2026-02-25 11:13:54 +0000 UTC" firstStartedPulling="2026-02-25 11:13:55.534899824 +0000 UTC m=+1261.033481849" lastFinishedPulling="2026-02-25 11:14:02.702456081 +0000 UTC m=+1268.201038146" observedRunningTime="2026-02-25 11:14:03.6288084 +0000 UTC m=+1269.127390475" watchObservedRunningTime="2026-02-25 11:14:03.633075915 +0000 UTC m=+1269.131657980" Feb 25 11:14:03 crc kubenswrapper[4725]: I0225 11:14:03.663381 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-grc9g" podStartSLOduration=2.11383344 podStartE2EDuration="10.663354298s" podCreationTimestamp="2026-02-25 11:13:53 +0000 UTC" firstStartedPulling="2026-02-25 11:13:54.158477021 +0000 UTC m=+1259.657059046" lastFinishedPulling="2026-02-25 11:14:02.707997879 +0000 UTC 
m=+1268.206579904" observedRunningTime="2026-02-25 11:14:03.655059015 +0000 UTC m=+1269.153641050" watchObservedRunningTime="2026-02-25 11:14:03.663354298 +0000 UTC m=+1269.161936363" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.215146 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.215400 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.246258 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.258081 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.285775 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.373800 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-scripts\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.373868 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-sg-core-conf-yaml\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.373942 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-run-httpd\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.373993 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-ceilometer-tls-certs\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.374068 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82vdm\" (UniqueName: \"kubernetes.io/projected/e443ecd8-7a45-4762-866f-f682b221da44-kube-api-access-82vdm\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.374096 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-combined-ca-bundle\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.374125 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-log-httpd\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.374159 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-config-data\") pod \"e443ecd8-7a45-4762-866f-f682b221da44\" (UID: \"e443ecd8-7a45-4762-866f-f682b221da44\") " Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.374324 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.374658 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.374790 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.379632 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e443ecd8-7a45-4762-866f-f682b221da44-kube-api-access-82vdm" (OuterVolumeSpecName: "kube-api-access-82vdm") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "kube-api-access-82vdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.379498 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-scripts" (OuterVolumeSpecName: "scripts") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.412272 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.465720 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.476228 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.476274 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.476295 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82vdm\" (UniqueName: \"kubernetes.io/projected/e443ecd8-7a45-4762-866f-f682b221da44-kube-api-access-82vdm\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.476313 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e443ecd8-7a45-4762-866f-f682b221da44-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.476327 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.480206 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-config-data" (OuterVolumeSpecName: "config-data") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.496675 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e443ecd8-7a45-4762-866f-f682b221da44" (UID: "e443ecd8-7a45-4762-866f-f682b221da44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.578248 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.578279 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443ecd8-7a45-4762-866f-f682b221da44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.617999 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533634-rq22r" event={"ID":"532209c0-1111-4779-9620-732c8d611e1c","Type":"ContainerStarted","Data":"0f1f3c64366c6ea77197b031bf340956d44d6efe1cac7cb78e09e8e0f77ea6d9"} Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621184 4725 generic.go:334] "Generic (PLEG): container finished" podID="e443ecd8-7a45-4762-866f-f682b221da44" containerID="e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" exitCode=0 Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621219 4725 generic.go:334] "Generic (PLEG): container finished" podID="e443ecd8-7a45-4762-866f-f682b221da44" containerID="d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" exitCode=2 Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621227 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="e443ecd8-7a45-4762-866f-f682b221da44" containerID="726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" exitCode=0 Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621232 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621253 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerDied","Data":"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a"} Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621287 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerDied","Data":"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f"} Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621302 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerDied","Data":"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7"} Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerDied","Data":"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0"} Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621238 4725 generic.go:334] "Generic (PLEG): container finished" podID="e443ecd8-7a45-4762-866f-f682b221da44" containerID="eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" exitCode=0 Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621331 4725 scope.go:117] "RemoveContainer" containerID="e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.621429 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e443ecd8-7a45-4762-866f-f682b221da44","Type":"ContainerDied","Data":"0525d3a5623bfc7a4267d77d15949a471e41a478a9eb07da640dcad1dcb872f5"} Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.622576 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.622642 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.638028 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533634-rq22r" podStartSLOduration=3.630104296 podStartE2EDuration="4.638012285s" podCreationTimestamp="2026-02-25 11:14:00 +0000 UTC" firstStartedPulling="2026-02-25 11:14:03.124561743 +0000 UTC m=+1268.623143778" lastFinishedPulling="2026-02-25 11:14:04.132469702 +0000 UTC m=+1269.631051767" observedRunningTime="2026-02-25 11:14:04.637734657 +0000 UTC m=+1270.136316682" watchObservedRunningTime="2026-02-25 11:14:04.638012285 +0000 UTC m=+1270.136594320" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.646461 4725 scope.go:117] "RemoveContainer" containerID="d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.675808 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.695553 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.708785 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.709283 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-notification-agent" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709307 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-notification-agent" Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.709334 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-central-agent" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709343 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-central-agent" Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.709363 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="proxy-httpd" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709372 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="proxy-httpd" Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.709397 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="sg-core" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709404 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="sg-core" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709632 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-central-agent" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709655 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="proxy-httpd" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709674 4725 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="sg-core" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.709692 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e443ecd8-7a45-4762-866f-f682b221da44" containerName="ceilometer-notification-agent" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.711597 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.714739 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.714908 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.714944 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.744145 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.764922 4725 scope.go:117] "RemoveContainer" containerID="726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.786385 4725 scope.go:117] "RemoveContainer" containerID="eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.806251 4725 scope.go:117] "RemoveContainer" containerID="e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.806666 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": container with ID starting with 
e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a not found: ID does not exist" containerID="e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.806689 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a"} err="failed to get container status \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": rpc error: code = NotFound desc = could not find container \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": container with ID starting with e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.806710 4725 scope.go:117] "RemoveContainer" containerID="d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.807233 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": container with ID starting with d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f not found: ID does not exist" containerID="d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.807253 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f"} err="failed to get container status \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": rpc error: code = NotFound desc = could not find container \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": container with ID starting with d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f not found: ID does not 
exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.807266 4725 scope.go:117] "RemoveContainer" containerID="726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.807646 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": container with ID starting with 726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7 not found: ID does not exist" containerID="726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.807699 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7"} err="failed to get container status \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": rpc error: code = NotFound desc = could not find container \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": container with ID starting with 726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7 not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.807733 4725 scope.go:117] "RemoveContainer" containerID="eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" Feb 25 11:14:04 crc kubenswrapper[4725]: E0225 11:14:04.808075 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": container with ID starting with eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0 not found: ID does not exist" containerID="eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.808109 4725 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0"} err="failed to get container status \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": rpc error: code = NotFound desc = could not find container \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": container with ID starting with eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0 not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.808130 4725 scope.go:117] "RemoveContainer" containerID="e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.808581 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a"} err="failed to get container status \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": rpc error: code = NotFound desc = could not find container \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": container with ID starting with e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.808606 4725 scope.go:117] "RemoveContainer" containerID="d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.808900 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f"} err="failed to get container status \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": rpc error: code = NotFound desc = could not find container \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": container with ID starting with 
d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.808919 4725 scope.go:117] "RemoveContainer" containerID="726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.809328 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7"} err="failed to get container status \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": rpc error: code = NotFound desc = could not find container \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": container with ID starting with 726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7 not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.809349 4725 scope.go:117] "RemoveContainer" containerID="eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.809559 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0"} err="failed to get container status \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": rpc error: code = NotFound desc = could not find container \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": container with ID starting with eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0 not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.809583 4725 scope.go:117] "RemoveContainer" containerID="e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.810023 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a"} err="failed to get container status \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": rpc error: code = NotFound desc = could not find container \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": container with ID starting with e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.810047 4725 scope.go:117] "RemoveContainer" containerID="d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.810342 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f"} err="failed to get container status \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": rpc error: code = NotFound desc = could not find container \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": container with ID starting with d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.810370 4725 scope.go:117] "RemoveContainer" containerID="726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.810649 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7"} err="failed to get container status \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": rpc error: code = NotFound desc = could not find container \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": container with ID starting with 726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7 not found: ID does not 
exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.810676 4725 scope.go:117] "RemoveContainer" containerID="eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811169 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0"} err="failed to get container status \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": rpc error: code = NotFound desc = could not find container \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": container with ID starting with eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0 not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811189 4725 scope.go:117] "RemoveContainer" containerID="e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811386 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a"} err="failed to get container status \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": rpc error: code = NotFound desc = could not find container \"e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a\": container with ID starting with e0424e40db03dee7f0de9de3eafa68863feb4a9881d41c86877bf5361fff321a not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811406 4725 scope.go:117] "RemoveContainer" containerID="d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811660 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f"} err="failed to get container status 
\"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": rpc error: code = NotFound desc = could not find container \"d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f\": container with ID starting with d191593d58bb85d91b100c7b8440b6ae8ce75079191d48c0bd9034c52a56df4f not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811678 4725 scope.go:117] "RemoveContainer" containerID="726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811860 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7"} err="failed to get container status \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": rpc error: code = NotFound desc = could not find container \"726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7\": container with ID starting with 726ef5413ac7e0a6f643e7e59bac88b2fecf0497202b3599041be35596a5edd7 not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.811878 4725 scope.go:117] "RemoveContainer" containerID="eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.812079 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0"} err="failed to get container status \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": rpc error: code = NotFound desc = could not find container \"eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0\": container with ID starting with eda1bcd5349cdffae660202a5e0caf9fd262d2d6932d455da1246ecc192fe4e0 not found: ID does not exist" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.871064 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.871129 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.884475 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.885061 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-run-httpd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.885171 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-scripts\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.885203 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.885260 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-798nd\" (UniqueName: 
\"kubernetes.io/projected/040deb18-257b-4642-8df3-2d7da1389ce6-kube-api-access-798nd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.885344 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.885411 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-config-data\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.885463 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-log-httpd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.901897 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.939041 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987291 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987350 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-run-httpd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987450 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-scripts\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987471 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987506 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-798nd\" (UniqueName: \"kubernetes.io/projected/040deb18-257b-4642-8df3-2d7da1389ce6-kube-api-access-798nd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987567 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987617 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-config-data\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.987668 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-log-httpd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.989152 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-run-httpd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.989384 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-log-httpd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.994291 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-config-data\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:04 crc kubenswrapper[4725]: I0225 11:14:04.997032 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.001656 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.012502 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.015567 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-scripts\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.021887 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-798nd\" (UniqueName: \"kubernetes.io/projected/040deb18-257b-4642-8df3-2d7da1389ce6-kube-api-access-798nd\") pod \"ceilometer-0\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " pod="openstack/ceilometer-0" Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.057897 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.235690 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e443ecd8-7a45-4762-866f-f682b221da44" path="/var/lib/kubelet/pods/e443ecd8-7a45-4762-866f-f682b221da44/volumes" Feb 25 11:14:05 crc kubenswrapper[4725]: W0225 11:14:05.524116 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod040deb18_257b_4642_8df3_2d7da1389ce6.slice/crio-74561fbedfe6cac9bdae45010488bd68c5945e07dab86c6aed334f67efdccb4d WatchSource:0}: Error finding container 74561fbedfe6cac9bdae45010488bd68c5945e07dab86c6aed334f67efdccb4d: Status 404 returned error can't find the container with id 74561fbedfe6cac9bdae45010488bd68c5945e07dab86c6aed334f67efdccb4d Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.542103 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.632411 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerStarted","Data":"74561fbedfe6cac9bdae45010488bd68c5945e07dab86c6aed334f67efdccb4d"} Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.635343 4725 generic.go:334] "Generic (PLEG): container finished" podID="532209c0-1111-4779-9620-732c8d611e1c" containerID="0f1f3c64366c6ea77197b031bf340956d44d6efe1cac7cb78e09e8e0f77ea6d9" exitCode=0 Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.635415 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533634-rq22r" event={"ID":"532209c0-1111-4779-9620-732c8d611e1c","Type":"ContainerDied","Data":"0f1f3c64366c6ea77197b031bf340956d44d6efe1cac7cb78e09e8e0f77ea6d9"} Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.638479 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:05 crc kubenswrapper[4725]: I0225 11:14:05.638520 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:06 crc kubenswrapper[4725]: I0225 11:14:06.497921 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 11:14:06 crc kubenswrapper[4725]: I0225 11:14:06.547594 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.052884 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533634-rq22r" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.124257 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwnsw\" (UniqueName: \"kubernetes.io/projected/532209c0-1111-4779-9620-732c8d611e1c-kube-api-access-vwnsw\") pod \"532209c0-1111-4779-9620-732c8d611e1c\" (UID: \"532209c0-1111-4779-9620-732c8d611e1c\") " Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.129640 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532209c0-1111-4779-9620-732c8d611e1c-kube-api-access-vwnsw" (OuterVolumeSpecName: "kube-api-access-vwnsw") pod "532209c0-1111-4779-9620-732c8d611e1c" (UID: "532209c0-1111-4779-9620-732c8d611e1c"). InnerVolumeSpecName "kube-api-access-vwnsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.225690 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwnsw\" (UniqueName: \"kubernetes.io/projected/532209c0-1111-4779-9620-732c8d611e1c-kube-api-access-vwnsw\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.627440 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.674230 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533634-rq22r" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.674360 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.674413 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533634-rq22r" event={"ID":"532209c0-1111-4779-9620-732c8d611e1c","Type":"ContainerDied","Data":"4daff68be7e5b6ac625bdbdaa205cdc527f1d19c31c627cebd5599ccd44a07dc"} Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.674439 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4daff68be7e5b6ac625bdbdaa205cdc527f1d19c31c627cebd5599ccd44a07dc" Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.677011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerStarted","Data":"ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a"} Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.677089 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerStarted","Data":"12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99"} Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.736350 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533628-ghwlf"] Feb 25 11:14:07 crc kubenswrapper[4725]: I0225 11:14:07.766139 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533628-ghwlf"] Feb 25 11:14:08 crc kubenswrapper[4725]: I0225 11:14:08.688347 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerStarted","Data":"0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23"} Feb 25 11:14:09 crc kubenswrapper[4725]: I0225 11:14:09.234596 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c2271b-a00c-41f0-a976-ce403bdb24ae" path="/var/lib/kubelet/pods/b9c2271b-a00c-41f0-a976-ce403bdb24ae/volumes" Feb 25 11:14:10 crc kubenswrapper[4725]: I0225 11:14:10.713521 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerStarted","Data":"9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb"} Feb 25 11:14:10 crc kubenswrapper[4725]: I0225 11:14:10.714548 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 11:14:10 crc kubenswrapper[4725]: I0225 11:14:10.741629 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4547667779999998 podStartE2EDuration="6.741606096s" podCreationTimestamp="2026-02-25 11:14:04 +0000 UTC" firstStartedPulling="2026-02-25 11:14:05.526856736 +0000 UTC m=+1271.025438771" lastFinishedPulling="2026-02-25 11:14:09.813696054 +0000 UTC m=+1275.312278089" observedRunningTime="2026-02-25 
11:14:10.735718958 +0000 UTC m=+1276.234300983" watchObservedRunningTime="2026-02-25 11:14:10.741606096 +0000 UTC m=+1276.240188131" Feb 25 11:14:15 crc kubenswrapper[4725]: I0225 11:14:15.771121 4725 generic.go:334] "Generic (PLEG): container finished" podID="757bb635-edf2-4081-a9a1-fdc66588e0aa" containerID="c9bf8f113504797681a6e783d046c1d61dae84c398e454734223e6fd8ce114b7" exitCode=0 Feb 25 11:14:15 crc kubenswrapper[4725]: I0225 11:14:15.771192 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grc9g" event={"ID":"757bb635-edf2-4081-a9a1-fdc66588e0aa","Type":"ContainerDied","Data":"c9bf8f113504797681a6e783d046c1d61dae84c398e454734223e6fd8ce114b7"} Feb 25 11:14:16 crc kubenswrapper[4725]: I0225 11:14:16.491726 4725 scope.go:117] "RemoveContainer" containerID="84d1d9149f6f72c7e26d69054b4a1c741ccdcebea3256a545f06bfb0edf268c2" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.215778 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grc9g" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.328222 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-scripts\") pod \"757bb635-edf2-4081-a9a1-fdc66588e0aa\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.328440 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-config-data\") pod \"757bb635-edf2-4081-a9a1-fdc66588e0aa\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.328506 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdwll\" (UniqueName: \"kubernetes.io/projected/757bb635-edf2-4081-a9a1-fdc66588e0aa-kube-api-access-cdwll\") pod \"757bb635-edf2-4081-a9a1-fdc66588e0aa\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.328560 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-combined-ca-bundle\") pod \"757bb635-edf2-4081-a9a1-fdc66588e0aa\" (UID: \"757bb635-edf2-4081-a9a1-fdc66588e0aa\") " Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.335770 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757bb635-edf2-4081-a9a1-fdc66588e0aa-kube-api-access-cdwll" (OuterVolumeSpecName: "kube-api-access-cdwll") pod "757bb635-edf2-4081-a9a1-fdc66588e0aa" (UID: "757bb635-edf2-4081-a9a1-fdc66588e0aa"). InnerVolumeSpecName "kube-api-access-cdwll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.336349 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-scripts" (OuterVolumeSpecName: "scripts") pod "757bb635-edf2-4081-a9a1-fdc66588e0aa" (UID: "757bb635-edf2-4081-a9a1-fdc66588e0aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.359550 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-config-data" (OuterVolumeSpecName: "config-data") pod "757bb635-edf2-4081-a9a1-fdc66588e0aa" (UID: "757bb635-edf2-4081-a9a1-fdc66588e0aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.361390 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "757bb635-edf2-4081-a9a1-fdc66588e0aa" (UID: "757bb635-edf2-4081-a9a1-fdc66588e0aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.430969 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.431673 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.431843 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757bb635-edf2-4081-a9a1-fdc66588e0aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.431950 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdwll\" (UniqueName: \"kubernetes.io/projected/757bb635-edf2-4081-a9a1-fdc66588e0aa-kube-api-access-cdwll\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.805235 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-grc9g" event={"ID":"757bb635-edf2-4081-a9a1-fdc66588e0aa","Type":"ContainerDied","Data":"e9a30accf103bf929950845481713438297bb742f98a6200849e84ce14602eb5"} Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.805281 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a30accf103bf929950845481713438297bb742f98a6200849e84ce14602eb5" Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.805344 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-grc9g"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.900446 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 25 11:14:17 crc kubenswrapper[4725]: E0225 11:14:17.900851 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532209c0-1111-4779-9620-732c8d611e1c" containerName="oc"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.900869 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="532209c0-1111-4779-9620-732c8d611e1c" containerName="oc"
Feb 25 11:14:17 crc kubenswrapper[4725]: E0225 11:14:17.900893 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757bb635-edf2-4081-a9a1-fdc66588e0aa" containerName="nova-cell0-conductor-db-sync"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.900901 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="757bb635-edf2-4081-a9a1-fdc66588e0aa" containerName="nova-cell0-conductor-db-sync"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.901070 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="532209c0-1111-4779-9620-732c8d611e1c" containerName="oc"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.901098 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="757bb635-edf2-4081-a9a1-fdc66588e0aa" containerName="nova-cell0-conductor-db-sync"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.901651 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.903948 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.904176 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rp7gg"
Feb 25 11:14:17 crc kubenswrapper[4725]: I0225 11:14:17.917982 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.042446 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.042532 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw8nm\" (UniqueName: \"kubernetes.io/projected/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-kube-api-access-fw8nm\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.042610 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.144355 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw8nm\" (UniqueName: \"kubernetes.io/projected/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-kube-api-access-fw8nm\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.144675 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.144893 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.152885 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.160421 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.161087 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw8nm\" (UniqueName: \"kubernetes.io/projected/96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6-kube-api-access-fw8nm\") pod \"nova-cell0-conductor-0\" (UID: \"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6\") " pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.228025 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.692397 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 25 11:14:18 crc kubenswrapper[4725]: I0225 11:14:18.821897 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6","Type":"ContainerStarted","Data":"fd14c9e5bb4c9deac6f9de3eb1142b4651750b85358f97107c98982c3660d973"}
Feb 25 11:14:19 crc kubenswrapper[4725]: I0225 11:14:19.837282 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6","Type":"ContainerStarted","Data":"5f742a90ed1c419d77e18267e4fd6fc0d743926e0d79b4b7d1b29dcef6125873"}
Feb 25 11:14:19 crc kubenswrapper[4725]: I0225 11:14:19.837765 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:19 crc kubenswrapper[4725]: I0225 11:14:19.873139 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.873115517 podStartE2EDuration="2.873115517s" podCreationTimestamp="2026-02-25 11:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:14:19.859929603 +0000 UTC m=+1285.358511668" watchObservedRunningTime="2026-02-25 11:14:19.873115517 +0000 UTC m=+1285.371697552"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.282482 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.851680 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7xdft"]
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.854043 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.859159 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.859360 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.884570 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7xdft"]
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.946870 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhbrz\" (UniqueName: \"kubernetes.io/projected/1d6ec572-732a-4118-bbd3-88295c5173da-kube-api-access-nhbrz\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.946971 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-scripts\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.947035 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.947066 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-config-data\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.981968 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.985644 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:23 crc kubenswrapper[4725]: I0225 11:14:23.989287 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.011718 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.049724 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.049853 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-scripts\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.049898 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.049955 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.049991 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-config-data\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.050024 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbqj2\" (UniqueName: \"kubernetes.io/projected/f908cbdf-92d0-4356-8139-2919a723a457-kube-api-access-dbqj2\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.050052 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbrz\" (UniqueName: \"kubernetes.io/projected/1d6ec572-732a-4118-bbd3-88295c5173da-kube-api-access-nhbrz\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.061870 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.063867 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.066238 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-scripts\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.066274 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.066898 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.071867 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-config-data\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.075546 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhbrz\" (UniqueName: \"kubernetes.io/projected/1d6ec572-732a-4118-bbd3-88295c5173da-kube-api-access-nhbrz\") pod \"nova-cell0-cell-mapping-7xdft\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.115751 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.151355 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.151403 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.151436 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmmh4\" (UniqueName: \"kubernetes.io/projected/e774709e-e8b9-420f-a2f0-1032219b0766-kube-api-access-gmmh4\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.151506 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbqj2\" (UniqueName: \"kubernetes.io/projected/f908cbdf-92d0-4356-8139-2919a723a457-kube-api-access-dbqj2\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.151533 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.151600 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-config-data\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.168546 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.169061 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.191328 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7xdft"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.193511 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.193650 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbqj2\" (UniqueName: \"kubernetes.io/projected/f908cbdf-92d0-4356-8139-2919a723a457-kube-api-access-dbqj2\") pod \"nova-cell1-novncproxy-0\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.195056 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.198708 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.234691 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.253007 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.253101 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aed3e04-ad7a-491d-a970-c7ec664799b8-logs\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.253126 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-config-data\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.253165 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.253189 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6694\" (UniqueName: \"kubernetes.io/projected/1aed3e04-ad7a-491d-a970-c7ec664799b8-kube-api-access-t6694\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.253236 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-config-data\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.253258 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmmh4\" (UniqueName: \"kubernetes.io/projected/e774709e-e8b9-420f-a2f0-1032219b0766-kube-api-access-gmmh4\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.261936 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-config-data\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.271869 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-h2f8n"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.273300 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.281879 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.284478 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmmh4\" (UniqueName: \"kubernetes.io/projected/e774709e-e8b9-420f-a2f0-1032219b0766-kube-api-access-gmmh4\") pod \"nova-scheduler-0\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.299103 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-h2f8n"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.306611 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.328787 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.394881 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395060 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aed3e04-ad7a-491d-a970-c7ec664799b8-logs\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395141 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69hq\" (UniqueName: \"kubernetes.io/projected/b356c2f5-ae04-4c30-932f-b0919fa9340c-kube-api-access-z69hq\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395189 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6694\" (UniqueName: \"kubernetes.io/projected/1aed3e04-ad7a-491d-a970-c7ec664799b8-kube-api-access-t6694\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395214 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-config-data\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395265 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395302 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-config\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395343 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395365 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.395419 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.396490 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aed3e04-ad7a-491d-a970-c7ec664799b8-logs\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.398284 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.407720 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-config-data\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.436549 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6694\" (UniqueName: \"kubernetes.io/projected/1aed3e04-ad7a-491d-a970-c7ec664799b8-kube-api-access-t6694\") pod \"nova-metadata-0\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.487104 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.488583 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.496950 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.497015 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-config\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.497049 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.497079 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.497134 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.497247 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69hq\" (UniqueName: \"kubernetes.io/projected/b356c2f5-ae04-4c30-932f-b0919fa9340c-kube-api-access-z69hq\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.499123 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.499548 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.499780 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-config\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.499941 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.500529 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.500871 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.501078 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.525703 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69hq\" (UniqueName: \"kubernetes.io/projected/b356c2f5-ae04-4c30-932f-b0919fa9340c-kube-api-access-z69hq\") pod \"dnsmasq-dns-757b4f8459-h2f8n\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") " pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.600961 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjddt\" (UniqueName: \"kubernetes.io/projected/a066e46e-d22c-4abf-bd26-653e283efc51-kube-api-access-mjddt\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.601032 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-config-data\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.601079 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.601147 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a066e46e-d22c-4abf-bd26-653e283efc51-logs\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.616859 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.624327 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.703058 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.703181 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a066e46e-d22c-4abf-bd26-653e283efc51-logs\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.703273 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjddt\" (UniqueName: \"kubernetes.io/projected/a066e46e-d22c-4abf-bd26-653e283efc51-kube-api-access-mjddt\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.703328 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-config-data\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.704898 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a066e46e-d22c-4abf-bd26-653e283efc51-logs\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.709610 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.714443 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-config-data\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.726405 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjddt\" (UniqueName: \"kubernetes.io/projected/a066e46e-d22c-4abf-bd26-653e283efc51-kube-api-access-mjddt\") pod \"nova-api-0\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " pod="openstack/nova-api-0"
Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.822706 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.875753 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7xdft"] Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.885791 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:14:24 crc kubenswrapper[4725]: I0225 11:14:24.986601 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.015357 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zphmq"] Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.016874 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.019665 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.019758 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.027097 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zphmq"] Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.116129 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.116217 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qcjtr\" (UniqueName: \"kubernetes.io/projected/9253f776-9f91-4908-95a0-1f495326291d-kube-api-access-qcjtr\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.116266 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-config-data\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.116326 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-scripts\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.215586 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-h2f8n"] Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.227332 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcjtr\" (UniqueName: \"kubernetes.io/projected/9253f776-9f91-4908-95a0-1f495326291d-kube-api-access-qcjtr\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.228056 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-config-data\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: 
\"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.247079 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-config-data\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.248608 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-scripts\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.248817 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.254768 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.257803 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-scripts\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " 
pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.267385 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcjtr\" (UniqueName: \"kubernetes.io/projected/9253f776-9f91-4908-95a0-1f495326291d-kube-api-access-qcjtr\") pod \"nova-cell1-conductor-db-sync-zphmq\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.271184 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.353264 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.437085 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.901457 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f908cbdf-92d0-4356-8139-2919a723a457","Type":"ContainerStarted","Data":"9a2cf3a7ae2630ec4cfd70193903c0c4610c8831f114bf82bd51072311dd7565"} Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.904371 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7xdft" event={"ID":"1d6ec572-732a-4118-bbd3-88295c5173da","Type":"ContainerStarted","Data":"122fa33a6fb946db18a1343242130a1b974362a2da0c3352409e5fcd2a6858f2"} Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.904427 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7xdft" event={"ID":"1d6ec572-732a-4118-bbd3-88295c5173da","Type":"ContainerStarted","Data":"92e186a8c1bf9c25d125fcd301e4660c15b39db9b155e7673fa1da19aea39a94"} Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.909774 4725 generic.go:334] "Generic (PLEG): container 
finished" podID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerID="65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b" exitCode=0 Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.909841 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" event={"ID":"b356c2f5-ae04-4c30-932f-b0919fa9340c","Type":"ContainerDied","Data":"65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b"} Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.909860 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" event={"ID":"b356c2f5-ae04-4c30-932f-b0919fa9340c","Type":"ContainerStarted","Data":"4a48797deb687f499501922852a52ab7a867f709395167a6778fc37da01d9eba"} Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.917337 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e774709e-e8b9-420f-a2f0-1032219b0766","Type":"ContainerStarted","Data":"a5916b18d50d01a26343680515b44adf7e2901400acb0d257bdd12413c713d32"} Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.929528 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7xdft" podStartSLOduration=2.929509902 podStartE2EDuration="2.929509902s" podCreationTimestamp="2026-02-25 11:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:14:25.925633788 +0000 UTC m=+1291.424215833" watchObservedRunningTime="2026-02-25 11:14:25.929509902 +0000 UTC m=+1291.428091927" Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 11:14:25.932969 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aed3e04-ad7a-491d-a970-c7ec664799b8","Type":"ContainerStarted","Data":"cd51f9fb48de1b418b85b5e33019e40e1345645096b73b8d5f3a2d79ced1c4f0"} Feb 25 11:14:25 crc kubenswrapper[4725]: I0225 
11:14:25.934954 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a066e46e-d22c-4abf-bd26-653e283efc51","Type":"ContainerStarted","Data":"ae75bb126f16c23e9741836ec0e1a52d2cc828fb19ff57abb3416b82f0707bb5"} Feb 25 11:14:26 crc kubenswrapper[4725]: I0225 11:14:26.024844 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zphmq"] Feb 25 11:14:26 crc kubenswrapper[4725]: I0225 11:14:26.948796 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zphmq" event={"ID":"9253f776-9f91-4908-95a0-1f495326291d","Type":"ContainerStarted","Data":"03d3d725bf3dc66aced72b53625686ca64039eba98fc55c862ea094d7439551e"} Feb 25 11:14:26 crc kubenswrapper[4725]: I0225 11:14:26.949088 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zphmq" event={"ID":"9253f776-9f91-4908-95a0-1f495326291d","Type":"ContainerStarted","Data":"90af7173a3537834fdc8c6ce7c09f5ad603d0fc5b2521bf5b8eac67ccfa3650f"} Feb 25 11:14:26 crc kubenswrapper[4725]: I0225 11:14:26.951182 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" event={"ID":"b356c2f5-ae04-4c30-932f-b0919fa9340c","Type":"ContainerStarted","Data":"9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c"} Feb 25 11:14:26 crc kubenswrapper[4725]: I0225 11:14:26.951319 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" Feb 25 11:14:26 crc kubenswrapper[4725]: I0225 11:14:26.969726 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zphmq" podStartSLOduration=2.969709497 podStartE2EDuration="2.969709497s" podCreationTimestamp="2026-02-25 11:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 
11:14:26.960535541 +0000 UTC m=+1292.459117566" watchObservedRunningTime="2026-02-25 11:14:26.969709497 +0000 UTC m=+1292.468291522" Feb 25 11:14:26 crc kubenswrapper[4725]: I0225 11:14:26.989260 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" podStartSLOduration=2.989240802 podStartE2EDuration="2.989240802s" podCreationTimestamp="2026-02-25 11:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:14:26.979677955 +0000 UTC m=+1292.478259990" watchObservedRunningTime="2026-02-25 11:14:26.989240802 +0000 UTC m=+1292.487822827" Feb 25 11:14:27 crc kubenswrapper[4725]: I0225 11:14:27.899974 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:27 crc kubenswrapper[4725]: I0225 11:14:27.915332 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.969141 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aed3e04-ad7a-491d-a970-c7ec664799b8","Type":"ContainerStarted","Data":"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403"} Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.969389 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aed3e04-ad7a-491d-a970-c7ec664799b8","Type":"ContainerStarted","Data":"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125"} Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.969233 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerName="nova-metadata-log" containerID="cri-o://a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125" gracePeriod=30 Feb 25 11:14:28 crc 
kubenswrapper[4725]: I0225 11:14:28.969504 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerName="nova-metadata-metadata" containerID="cri-o://b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403" gracePeriod=30 Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.970883 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a066e46e-d22c-4abf-bd26-653e283efc51","Type":"ContainerStarted","Data":"d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761"} Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.970913 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a066e46e-d22c-4abf-bd26-653e283efc51","Type":"ContainerStarted","Data":"37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243"} Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.974125 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f908cbdf-92d0-4356-8139-2919a723a457","Type":"ContainerStarted","Data":"4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8"} Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.974277 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f908cbdf-92d0-4356-8139-2919a723a457" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8" gracePeriod=30 Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.975635 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e774709e-e8b9-420f-a2f0-1032219b0766","Type":"ContainerStarted","Data":"705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755"} Feb 25 11:14:28 crc kubenswrapper[4725]: I0225 11:14:28.992202 4725 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.098485189 podStartE2EDuration="4.992183005s" podCreationTimestamp="2026-02-25 11:14:24 +0000 UTC" firstStartedPulling="2026-02-25 11:14:25.258936379 +0000 UTC m=+1290.757518404" lastFinishedPulling="2026-02-25 11:14:28.152634195 +0000 UTC m=+1293.651216220" observedRunningTime="2026-02-25 11:14:28.989222555 +0000 UTC m=+1294.487804580" watchObservedRunningTime="2026-02-25 11:14:28.992183005 +0000 UTC m=+1294.490765020" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.010341 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.85864222 podStartE2EDuration="6.010323442s" podCreationTimestamp="2026-02-25 11:14:23 +0000 UTC" firstStartedPulling="2026-02-25 11:14:25.002259608 +0000 UTC m=+1290.500841633" lastFinishedPulling="2026-02-25 11:14:28.15394083 +0000 UTC m=+1293.652522855" observedRunningTime="2026-02-25 11:14:29.001107654 +0000 UTC m=+1294.499689679" watchObservedRunningTime="2026-02-25 11:14:29.010323442 +0000 UTC m=+1294.508905467" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.030951 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.329143671 podStartE2EDuration="5.030934425s" podCreationTimestamp="2026-02-25 11:14:24 +0000 UTC" firstStartedPulling="2026-02-25 11:14:25.482184923 +0000 UTC m=+1290.980766948" lastFinishedPulling="2026-02-25 11:14:28.183975677 +0000 UTC m=+1293.682557702" observedRunningTime="2026-02-25 11:14:29.022517139 +0000 UTC m=+1294.521099164" watchObservedRunningTime="2026-02-25 11:14:29.030934425 +0000 UTC m=+1294.529516450" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.045909 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.813182329 
podStartE2EDuration="5.045891387s" podCreationTimestamp="2026-02-25 11:14:24 +0000 UTC" firstStartedPulling="2026-02-25 11:14:24.921177851 +0000 UTC m=+1290.419759876" lastFinishedPulling="2026-02-25 11:14:28.153886919 +0000 UTC m=+1293.652468934" observedRunningTime="2026-02-25 11:14:29.037910512 +0000 UTC m=+1294.536492557" watchObservedRunningTime="2026-02-25 11:14:29.045891387 +0000 UTC m=+1294.544473412" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.307871 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.330001 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.618456 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.618505 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.633295 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.788627 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-config-data\") pod \"1aed3e04-ad7a-491d-a970-c7ec664799b8\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.789025 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-combined-ca-bundle\") pod \"1aed3e04-ad7a-491d-a970-c7ec664799b8\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.789106 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aed3e04-ad7a-491d-a970-c7ec664799b8-logs\") pod \"1aed3e04-ad7a-491d-a970-c7ec664799b8\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.789241 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6694\" (UniqueName: \"kubernetes.io/projected/1aed3e04-ad7a-491d-a970-c7ec664799b8-kube-api-access-t6694\") pod \"1aed3e04-ad7a-491d-a970-c7ec664799b8\" (UID: \"1aed3e04-ad7a-491d-a970-c7ec664799b8\") " Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.789980 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aed3e04-ad7a-491d-a970-c7ec664799b8-logs" (OuterVolumeSpecName: "logs") pod "1aed3e04-ad7a-491d-a970-c7ec664799b8" (UID: "1aed3e04-ad7a-491d-a970-c7ec664799b8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.793600 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aed3e04-ad7a-491d-a970-c7ec664799b8-kube-api-access-t6694" (OuterVolumeSpecName: "kube-api-access-t6694") pod "1aed3e04-ad7a-491d-a970-c7ec664799b8" (UID: "1aed3e04-ad7a-491d-a970-c7ec664799b8"). InnerVolumeSpecName "kube-api-access-t6694". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.818185 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-config-data" (OuterVolumeSpecName: "config-data") pod "1aed3e04-ad7a-491d-a970-c7ec664799b8" (UID: "1aed3e04-ad7a-491d-a970-c7ec664799b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.820487 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aed3e04-ad7a-491d-a970-c7ec664799b8" (UID: "1aed3e04-ad7a-491d-a970-c7ec664799b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.891811 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6694\" (UniqueName: \"kubernetes.io/projected/1aed3e04-ad7a-491d-a970-c7ec664799b8-kube-api-access-t6694\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.891872 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.891884 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aed3e04-ad7a-491d-a970-c7ec664799b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.891898 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aed3e04-ad7a-491d-a970-c7ec664799b8-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.987968 4725 generic.go:334] "Generic (PLEG): container finished" podID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerID="b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403" exitCode=0 Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.988043 4725 generic.go:334] "Generic (PLEG): container finished" podID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerID="a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125" exitCode=143 Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.988029 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.988128 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aed3e04-ad7a-491d-a970-c7ec664799b8","Type":"ContainerDied","Data":"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403"} Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.988173 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aed3e04-ad7a-491d-a970-c7ec664799b8","Type":"ContainerDied","Data":"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125"} Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.988186 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1aed3e04-ad7a-491d-a970-c7ec664799b8","Type":"ContainerDied","Data":"cd51f9fb48de1b418b85b5e33019e40e1345645096b73b8d5f3a2d79ced1c4f0"} Feb 25 11:14:29 crc kubenswrapper[4725]: I0225 11:14:29.988204 4725 scope.go:117] "RemoveContainer" containerID="b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.013279 4725 scope.go:117] "RemoveContainer" containerID="a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.042307 4725 scope.go:117] "RemoveContainer" containerID="b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403" Feb 25 11:14:30 crc kubenswrapper[4725]: E0225 11:14:30.042747 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403\": container with ID starting with b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403 not found: ID does not exist" containerID="b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403" Feb 25 11:14:30 crc kubenswrapper[4725]: 
I0225 11:14:30.042794 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403"} err="failed to get container status \"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403\": rpc error: code = NotFound desc = could not find container \"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403\": container with ID starting with b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403 not found: ID does not exist" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.042819 4725 scope.go:117] "RemoveContainer" containerID="a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125" Feb 25 11:14:30 crc kubenswrapper[4725]: E0225 11:14:30.043131 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125\": container with ID starting with a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125 not found: ID does not exist" containerID="a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.043160 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125"} err="failed to get container status \"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125\": rpc error: code = NotFound desc = could not find container \"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125\": container with ID starting with a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125 not found: ID does not exist" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.043178 4725 scope.go:117] "RemoveContainer" containerID="b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403" Feb 25 11:14:30 crc 
kubenswrapper[4725]: I0225 11:14:30.045196 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403"} err="failed to get container status \"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403\": rpc error: code = NotFound desc = could not find container \"b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403\": container with ID starting with b07a92f896522c9198c629b71599b29c892486b9933943c3c23cefe958ae5403 not found: ID does not exist" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.045252 4725 scope.go:117] "RemoveContainer" containerID="a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.046256 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125"} err="failed to get container status \"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125\": rpc error: code = NotFound desc = could not find container \"a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125\": container with ID starting with a7347e509d6e79a5931f6be13c1b98bce26f9e4e108f053fb741d6a98072e125 not found: ID does not exist" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.056804 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.066268 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.075688 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:30 crc kubenswrapper[4725]: E0225 11:14:30.076410 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" 
containerName="nova-metadata-log" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.076498 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerName="nova-metadata-log" Feb 25 11:14:30 crc kubenswrapper[4725]: E0225 11:14:30.076621 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerName="nova-metadata-metadata" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.076696 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerName="nova-metadata-metadata" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.077009 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerName="nova-metadata-log" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.077106 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" containerName="nova-metadata-metadata" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.078165 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.086624 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.086692 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.086743 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.196990 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkmwx\" (UniqueName: \"kubernetes.io/projected/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-kube-api-access-rkmwx\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.197038 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.197083 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-logs\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.197111 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-config-data\") pod \"nova-metadata-0\" (UID: 
\"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.197129 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.299688 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkmwx\" (UniqueName: \"kubernetes.io/projected/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-kube-api-access-rkmwx\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.299736 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.299775 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-logs\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.299805 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-config-data\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.299865 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.300222 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-logs\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.303398 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.306348 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.310437 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-config-data\") pod \"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.333336 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkmwx\" (UniqueName: \"kubernetes.io/projected/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-kube-api-access-rkmwx\") pod 
\"nova-metadata-0\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.403221 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.880538 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:30 crc kubenswrapper[4725]: I0225 11:14:30.998227 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c63c14a-b917-47b5-b9ee-02a7ef7698f1","Type":"ContainerStarted","Data":"147c208d6206552b0c9a3dab6a4f4ab96e29e9f5ccbac9702162fe7932840fe2"} Feb 25 11:14:31 crc kubenswrapper[4725]: I0225 11:14:31.238418 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aed3e04-ad7a-491d-a970-c7ec664799b8" path="/var/lib/kubelet/pods/1aed3e04-ad7a-491d-a970-c7ec664799b8/volumes" Feb 25 11:14:32 crc kubenswrapper[4725]: I0225 11:14:32.013814 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c63c14a-b917-47b5-b9ee-02a7ef7698f1","Type":"ContainerStarted","Data":"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e"} Feb 25 11:14:32 crc kubenswrapper[4725]: I0225 11:14:32.014395 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c63c14a-b917-47b5-b9ee-02a7ef7698f1","Type":"ContainerStarted","Data":"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71"} Feb 25 11:14:32 crc kubenswrapper[4725]: I0225 11:14:32.053647 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.053617274 podStartE2EDuration="2.053617274s" podCreationTimestamp="2026-02-25 11:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-25 11:14:32.040696167 +0000 UTC m=+1297.539278272" watchObservedRunningTime="2026-02-25 11:14:32.053617274 +0000 UTC m=+1297.552199339" Feb 25 11:14:33 crc kubenswrapper[4725]: I0225 11:14:33.030272 4725 generic.go:334] "Generic (PLEG): container finished" podID="1d6ec572-732a-4118-bbd3-88295c5173da" containerID="122fa33a6fb946db18a1343242130a1b974362a2da0c3352409e5fcd2a6858f2" exitCode=0 Feb 25 11:14:33 crc kubenswrapper[4725]: I0225 11:14:33.030988 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7xdft" event={"ID":"1d6ec572-732a-4118-bbd3-88295c5173da","Type":"ContainerDied","Data":"122fa33a6fb946db18a1343242130a1b974362a2da0c3352409e5fcd2a6858f2"} Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.045939 4725 generic.go:334] "Generic (PLEG): container finished" podID="9253f776-9f91-4908-95a0-1f495326291d" containerID="03d3d725bf3dc66aced72b53625686ca64039eba98fc55c862ea094d7439551e" exitCode=0 Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.046011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zphmq" event={"ID":"9253f776-9f91-4908-95a0-1f495326291d","Type":"ContainerDied","Data":"03d3d725bf3dc66aced72b53625686ca64039eba98fc55c862ea094d7439551e"} Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.308036 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.337359 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.471012 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7xdft" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.588333 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-combined-ca-bundle\") pod \"1d6ec572-732a-4118-bbd3-88295c5173da\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.588703 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhbrz\" (UniqueName: \"kubernetes.io/projected/1d6ec572-732a-4118-bbd3-88295c5173da-kube-api-access-nhbrz\") pod \"1d6ec572-732a-4118-bbd3-88295c5173da\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.588764 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-scripts\") pod \"1d6ec572-732a-4118-bbd3-88295c5173da\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.588815 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-config-data\") pod \"1d6ec572-732a-4118-bbd3-88295c5173da\" (UID: \"1d6ec572-732a-4118-bbd3-88295c5173da\") " Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.594952 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6ec572-732a-4118-bbd3-88295c5173da-kube-api-access-nhbrz" (OuterVolumeSpecName: "kube-api-access-nhbrz") pod "1d6ec572-732a-4118-bbd3-88295c5173da" (UID: "1d6ec572-732a-4118-bbd3-88295c5173da"). InnerVolumeSpecName "kube-api-access-nhbrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.609139 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-scripts" (OuterVolumeSpecName: "scripts") pod "1d6ec572-732a-4118-bbd3-88295c5173da" (UID: "1d6ec572-732a-4118-bbd3-88295c5173da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.626289 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.626739 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-config-data" (OuterVolumeSpecName: "config-data") pod "1d6ec572-732a-4118-bbd3-88295c5173da" (UID: "1d6ec572-732a-4118-bbd3-88295c5173da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.640569 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d6ec572-732a-4118-bbd3-88295c5173da" (UID: "1d6ec572-732a-4118-bbd3-88295c5173da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.691218 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-4ntbw"] Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.691363 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhbrz\" (UniqueName: \"kubernetes.io/projected/1d6ec572-732a-4118-bbd3-88295c5173da-kube-api-access-nhbrz\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.691411 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.691432 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.691451 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6ec572-732a-4118-bbd3-88295c5173da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.693783 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" podUID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerName="dnsmasq-dns" containerID="cri-o://38e21351c8c08b2e6efe54354e4ca9e6b31f36c834840273944f7410012695bb" gracePeriod=10 Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.824093 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 11:14:34 crc kubenswrapper[4725]: I0225 11:14:34.824143 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 11:14:35 crc 
kubenswrapper[4725]: I0225 11:14:35.064849 4725 generic.go:334] "Generic (PLEG): container finished" podID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerID="38e21351c8c08b2e6efe54354e4ca9e6b31f36c834840273944f7410012695bb" exitCode=0 Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.064930 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" event={"ID":"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5","Type":"ContainerDied","Data":"38e21351c8c08b2e6efe54354e4ca9e6b31f36c834840273944f7410012695bb"} Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.074601 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.083641 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7xdft" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.090952 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7xdft" event={"ID":"1d6ec572-732a-4118-bbd3-88295c5173da","Type":"ContainerDied","Data":"92e186a8c1bf9c25d125fcd301e4660c15b39db9b155e7673fa1da19aea39a94"} Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.090991 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92e186a8c1bf9c25d125fcd301e4660c15b39db9b155e7673fa1da19aea39a94" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.195317 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.212321 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.303109 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-nb\") pod \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.303299 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-swift-storage-0\") pod \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.303349 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-sb\") pod \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.303372 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config\") pod \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.303390 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-svc\") pod \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.303412 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2djrx\" 
(UniqueName: \"kubernetes.io/projected/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-kube-api-access-2djrx\") pod \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.309927 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.310234 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-log" containerID="cri-o://37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243" gracePeriod=30 Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.310764 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-api" containerID="cri-o://d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761" gracePeriod=30 Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.312257 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-kube-api-access-2djrx" (OuterVolumeSpecName: "kube-api-access-2djrx") pod "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" (UID: "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5"). InnerVolumeSpecName "kube-api-access-2djrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.327743 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": EOF" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.327746 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": EOF" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.328153 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.330746 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-log" containerID="cri-o://c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71" gracePeriod=30 Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.330908 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-metadata" containerID="cri-o://a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e" gracePeriod=30 Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.412848 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.412893 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.413470 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" (UID: "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.413544 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" (UID: "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.414212 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.414240 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.414249 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2djrx\" (UniqueName: \"kubernetes.io/projected/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-kube-api-access-2djrx\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.419344 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" (UID: "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: E0225 11:14:35.419576 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config podName:fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5 nodeName:}" failed. No retries permitted until 2026-02-25 11:14:35.919552998 +0000 UTC m=+1301.418135023 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config") pod "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" (UID: "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5") : error deleting /var/lib/kubelet/pods/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5/volume-subpaths: remove /var/lib/kubelet/pods/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5/volume-subpaths: no such file or directory Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.421294 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" (UID: "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.516938 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.517077 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.681593 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.828477 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-config-data\") pod \"9253f776-9f91-4908-95a0-1f495326291d\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.828539 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-scripts\") pod \"9253f776-9f91-4908-95a0-1f495326291d\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.828636 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-combined-ca-bundle\") pod \"9253f776-9f91-4908-95a0-1f495326291d\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.828700 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcjtr\" (UniqueName: \"kubernetes.io/projected/9253f776-9f91-4908-95a0-1f495326291d-kube-api-access-qcjtr\") pod \"9253f776-9f91-4908-95a0-1f495326291d\" (UID: \"9253f776-9f91-4908-95a0-1f495326291d\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.832878 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-scripts" (OuterVolumeSpecName: "scripts") pod "9253f776-9f91-4908-95a0-1f495326291d" (UID: "9253f776-9f91-4908-95a0-1f495326291d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.836056 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9253f776-9f91-4908-95a0-1f495326291d-kube-api-access-qcjtr" (OuterVolumeSpecName: "kube-api-access-qcjtr") pod "9253f776-9f91-4908-95a0-1f495326291d" (UID: "9253f776-9f91-4908-95a0-1f495326291d"). InnerVolumeSpecName "kube-api-access-qcjtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.843584 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.845107 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.858286 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9253f776-9f91-4908-95a0-1f495326291d" (UID: "9253f776-9f91-4908-95a0-1f495326291d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.871072 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-config-data" (OuterVolumeSpecName: "config-data") pod "9253f776-9f91-4908-95a0-1f495326291d" (UID: "9253f776-9f91-4908-95a0-1f495326291d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.930526 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config\") pod \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\" (UID: \"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5\") " Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.931514 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config" (OuterVolumeSpecName: "config") pod "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" (UID: "fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.932087 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.932107 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.932265 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.932308 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcjtr\" (UniqueName: \"kubernetes.io/projected/9253f776-9f91-4908-95a0-1f495326291d-kube-api-access-qcjtr\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:35 crc kubenswrapper[4725]: I0225 11:14:35.932318 4725 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9253f776-9f91-4908-95a0-1f495326291d-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.033652 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkmwx\" (UniqueName: \"kubernetes.io/projected/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-kube-api-access-rkmwx\") pod \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.033716 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-combined-ca-bundle\") pod \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.033746 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-logs\") pod \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.033949 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-config-data\") pod \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.033980 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-nova-metadata-tls-certs\") pod \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\" (UID: \"0c63c14a-b917-47b5-b9ee-02a7ef7698f1\") " Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.034731 4725 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-logs" (OuterVolumeSpecName: "logs") pod "0c63c14a-b917-47b5-b9ee-02a7ef7698f1" (UID: "0c63c14a-b917-47b5-b9ee-02a7ef7698f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.038946 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-kube-api-access-rkmwx" (OuterVolumeSpecName: "kube-api-access-rkmwx") pod "0c63c14a-b917-47b5-b9ee-02a7ef7698f1" (UID: "0c63c14a-b917-47b5-b9ee-02a7ef7698f1"). InnerVolumeSpecName "kube-api-access-rkmwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.056318 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-config-data" (OuterVolumeSpecName: "config-data") pod "0c63c14a-b917-47b5-b9ee-02a7ef7698f1" (UID: "0c63c14a-b917-47b5-b9ee-02a7ef7698f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.062773 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c63c14a-b917-47b5-b9ee-02a7ef7698f1" (UID: "0c63c14a-b917-47b5-b9ee-02a7ef7698f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.091013 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" event={"ID":"fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5","Type":"ContainerDied","Data":"e211ae86272138c295603fc9beaf5a588b4b4ede7f7335302dd2b2169c2852c4"} Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.091242 4725 scope.go:117] "RemoveContainer" containerID="38e21351c8c08b2e6efe54354e4ca9e6b31f36c834840273944f7410012695bb" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.091458 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-4ntbw" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.129455 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zphmq" event={"ID":"9253f776-9f91-4908-95a0-1f495326291d","Type":"ContainerDied","Data":"90af7173a3537834fdc8c6ce7c09f5ad603d0fc5b2521bf5b8eac67ccfa3650f"} Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.129496 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90af7173a3537834fdc8c6ce7c09f5ad603d0fc5b2521bf5b8eac67ccfa3650f" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.129582 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zphmq" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.136998 4725 scope.go:117] "RemoveContainer" containerID="cacef5bdb7fa98cd4e0c9cc88e4fcda5f425a2637bf2280a79a21f09a0af3323" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.141615 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.141641 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkmwx\" (UniqueName: \"kubernetes.io/projected/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-kube-api-access-rkmwx\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.141651 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.141659 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.142791 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0c63c14a-b917-47b5-b9ee-02a7ef7698f1" (UID: "0c63c14a-b917-47b5-b9ee-02a7ef7698f1"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.152429 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-4ntbw"] Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.157148 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerID="a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e" exitCode=0 Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.157244 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerID="c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71" exitCode=143 Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.157337 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c63c14a-b917-47b5-b9ee-02a7ef7698f1","Type":"ContainerDied","Data":"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e"} Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.157408 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c63c14a-b917-47b5-b9ee-02a7ef7698f1","Type":"ContainerDied","Data":"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71"} Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.157505 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c63c14a-b917-47b5-b9ee-02a7ef7698f1","Type":"ContainerDied","Data":"147c208d6206552b0c9a3dab6a4f4ab96e29e9f5ccbac9702162fe7932840fe2"} Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.157613 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.192137 4725 generic.go:334] "Generic (PLEG): container finished" podID="a066e46e-d22c-4abf-bd26-653e283efc51" containerID="37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243" exitCode=143 Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.193080 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a066e46e-d22c-4abf-bd26-653e283efc51","Type":"ContainerDied","Data":"37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243"} Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.197181 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-4ntbw"] Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.225963 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.226383 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-metadata" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226395 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-metadata" Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.226410 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerName="dnsmasq-dns" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226417 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerName="dnsmasq-dns" Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.226432 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerName="init" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226438 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerName="init" Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.226454 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-log" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226459 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-log" Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.226472 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9253f776-9f91-4908-95a0-1f495326291d" containerName="nova-cell1-conductor-db-sync" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226478 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9253f776-9f91-4908-95a0-1f495326291d" containerName="nova-cell1-conductor-db-sync" Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.226488 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6ec572-732a-4118-bbd3-88295c5173da" containerName="nova-manage" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226494 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6ec572-732a-4118-bbd3-88295c5173da" containerName="nova-manage" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226652 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9253f776-9f91-4908-95a0-1f495326291d" containerName="nova-cell1-conductor-db-sync" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226665 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" containerName="dnsmasq-dns" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226679 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-log" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 
11:14:36.226690 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6ec572-732a-4118-bbd3-88295c5173da" containerName="nova-manage" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.226704 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" containerName="nova-metadata-metadata" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.227315 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.232209 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.240554 4725 scope.go:117] "RemoveContainer" containerID="a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.251432 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e17e12f-d899-470f-8087-b92c47f46c5b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.251475 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjsr\" (UniqueName: \"kubernetes.io/projected/1e17e12f-d899-470f-8087-b92c47f46c5b-kube-api-access-qkjsr\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.251745 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e17e12f-d899-470f-8087-b92c47f46c5b-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.252074 4725 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c63c14a-b917-47b5-b9ee-02a7ef7698f1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.265688 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.281971 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.292309 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.300537 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.302045 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.305049 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.306060 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.339704 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.343991 4725 scope.go:117] "RemoveContainer" containerID="c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.353538 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e17e12f-d899-470f-8087-b92c47f46c5b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.353680 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.353778 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.353922 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e17e12f-d899-470f-8087-b92c47f46c5b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.353992 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjsr\" (UniqueName: \"kubernetes.io/projected/1e17e12f-d899-470f-8087-b92c47f46c5b-kube-api-access-qkjsr\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.354136 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b964l\" (UniqueName: \"kubernetes.io/projected/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-kube-api-access-b964l\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.354225 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-config-data\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.354296 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-logs\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.359659 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e17e12f-d899-470f-8087-b92c47f46c5b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.361678 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e17e12f-d899-470f-8087-b92c47f46c5b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.373149 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjsr\" (UniqueName: \"kubernetes.io/projected/1e17e12f-d899-470f-8087-b92c47f46c5b-kube-api-access-qkjsr\") pod \"nova-cell1-conductor-0\" (UID: \"1e17e12f-d899-470f-8087-b92c47f46c5b\") " pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.451386 4725 scope.go:117] "RemoveContainer" containerID="a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e" Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.452114 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e\": container with ID starting with a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e not found: ID does not exist" containerID="a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.452247 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e"} err="failed to get container status \"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e\": rpc error: code = NotFound desc = could not find container 
\"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e\": container with ID starting with a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e not found: ID does not exist" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.452490 4725 scope.go:117] "RemoveContainer" containerID="c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71" Feb 25 11:14:36 crc kubenswrapper[4725]: E0225 11:14:36.452960 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71\": container with ID starting with c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71 not found: ID does not exist" containerID="c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.452995 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71"} err="failed to get container status \"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71\": rpc error: code = NotFound desc = could not find container \"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71\": container with ID starting with c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71 not found: ID does not exist" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.453017 4725 scope.go:117] "RemoveContainer" containerID="a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.453207 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e"} err="failed to get container status \"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e\": rpc error: code = NotFound desc = could not find 
container \"a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e\": container with ID starting with a763811b6ebf0bcfffba121ddbcdd59b0c42e62452cd77750bde1f079c7ae36e not found: ID does not exist" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.453227 4725 scope.go:117] "RemoveContainer" containerID="c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.453424 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71"} err="failed to get container status \"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71\": rpc error: code = NotFound desc = could not find container \"c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71\": container with ID starting with c414530dfb0664004b3f0142079948f77cf14928204360745bda8236b828eb71 not found: ID does not exist" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.456295 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.456346 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.456452 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b964l\" (UniqueName: \"kubernetes.io/projected/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-kube-api-access-b964l\") pod 
\"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.456481 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-config-data\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.456498 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-logs\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.457335 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-logs\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.464194 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.464364 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-config-data\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.464467 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.472363 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b964l\" (UniqueName: \"kubernetes.io/projected/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-kube-api-access-b964l\") pod \"nova-metadata-0\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " pod="openstack/nova-metadata-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.566756 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:36 crc kubenswrapper[4725]: I0225 11:14:36.629084 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.038624 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 11:14:37 crc kubenswrapper[4725]: W0225 11:14:37.143202 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7facde5c_b0f0_4cbd_994c_15eb5a9ac57a.slice/crio-6aa5307de5eb33cec5c5b501c21ead13319b642d654e0936a0349776eaab9390 WatchSource:0}: Error finding container 6aa5307de5eb33cec5c5b501c21ead13319b642d654e0936a0349776eaab9390: Status 404 returned error can't find the container with id 6aa5307de5eb33cec5c5b501c21ead13319b642d654e0936a0349776eaab9390 Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.148586 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.209022 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"1e17e12f-d899-470f-8087-b92c47f46c5b","Type":"ContainerStarted","Data":"9a6ab0720d2fd68791f4f6f731b2417109313e505cec023f67cad2e350959b8f"} Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.209445 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.209499 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1e17e12f-d899-470f-8087-b92c47f46c5b","Type":"ContainerStarted","Data":"6bec591f0269726fcc0c930849bdc85ce6e2b055abfd46f60a7a9adb6c5a8424"} Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.212773 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a","Type":"ContainerStarted","Data":"6aa5307de5eb33cec5c5b501c21ead13319b642d654e0936a0349776eaab9390"} Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.215926 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e774709e-e8b9-420f-a2f0-1032219b0766" containerName="nova-scheduler-scheduler" containerID="cri-o://705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755" gracePeriod=30 Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.226390 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.226350555 podStartE2EDuration="1.226350555s" podCreationTimestamp="2026-02-25 11:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:14:37.22469795 +0000 UTC m=+1302.723279995" watchObservedRunningTime="2026-02-25 11:14:37.226350555 +0000 UTC m=+1302.724932590" Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.241220 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0c63c14a-b917-47b5-b9ee-02a7ef7698f1" path="/var/lib/kubelet/pods/0c63c14a-b917-47b5-b9ee-02a7ef7698f1/volumes" Feb 25 11:14:37 crc kubenswrapper[4725]: I0225 11:14:37.241980 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5" path="/var/lib/kubelet/pods/fc667f48-ac1c-4c0a-8e15-25c7adb7e6a5/volumes" Feb 25 11:14:38 crc kubenswrapper[4725]: I0225 11:14:38.239993 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a","Type":"ContainerStarted","Data":"4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0"} Feb 25 11:14:38 crc kubenswrapper[4725]: I0225 11:14:38.240394 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a","Type":"ContainerStarted","Data":"ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962"} Feb 25 11:14:38 crc kubenswrapper[4725]: I0225 11:14:38.278316 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.278295216 podStartE2EDuration="2.278295216s" podCreationTimestamp="2026-02-25 11:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:14:38.271167705 +0000 UTC m=+1303.769749730" watchObservedRunningTime="2026-02-25 11:14:38.278295216 +0000 UTC m=+1303.776877261" Feb 25 11:14:39 crc kubenswrapper[4725]: E0225 11:14:39.308668 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 25 11:14:39 crc kubenswrapper[4725]: E0225 11:14:39.310364 4725 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 25 11:14:39 crc kubenswrapper[4725]: E0225 11:14:39.311385 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 25 11:14:39 crc kubenswrapper[4725]: E0225 11:14:39.311442 4725 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e774709e-e8b9-420f-a2f0-1032219b0766" containerName="nova-scheduler-scheduler" Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.254121 4725 generic.go:334] "Generic (PLEG): container finished" podID="e774709e-e8b9-420f-a2f0-1032219b0766" containerID="705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755" exitCode=0 Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.254240 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e774709e-e8b9-420f-a2f0-1032219b0766","Type":"ContainerDied","Data":"705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755"} Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.379683 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.438775 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-combined-ca-bundle\") pod \"e774709e-e8b9-420f-a2f0-1032219b0766\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.438884 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmmh4\" (UniqueName: \"kubernetes.io/projected/e774709e-e8b9-420f-a2f0-1032219b0766-kube-api-access-gmmh4\") pod \"e774709e-e8b9-420f-a2f0-1032219b0766\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.439064 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-config-data\") pod \"e774709e-e8b9-420f-a2f0-1032219b0766\" (UID: \"e774709e-e8b9-420f-a2f0-1032219b0766\") " Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.445500 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e774709e-e8b9-420f-a2f0-1032219b0766-kube-api-access-gmmh4" (OuterVolumeSpecName: "kube-api-access-gmmh4") pod "e774709e-e8b9-420f-a2f0-1032219b0766" (UID: "e774709e-e8b9-420f-a2f0-1032219b0766"). InnerVolumeSpecName "kube-api-access-gmmh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.465593 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-config-data" (OuterVolumeSpecName: "config-data") pod "e774709e-e8b9-420f-a2f0-1032219b0766" (UID: "e774709e-e8b9-420f-a2f0-1032219b0766"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.481986 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e774709e-e8b9-420f-a2f0-1032219b0766" (UID: "e774709e-e8b9-420f-a2f0-1032219b0766"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.541778 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.541815 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774709e-e8b9-420f-a2f0-1032219b0766-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:40 crc kubenswrapper[4725]: I0225 11:14:40.541870 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmmh4\" (UniqueName: \"kubernetes.io/projected/e774709e-e8b9-420f-a2f0-1032219b0766-kube-api-access-gmmh4\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.264368 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e774709e-e8b9-420f-a2f0-1032219b0766","Type":"ContainerDied","Data":"a5916b18d50d01a26343680515b44adf7e2901400acb0d257bdd12413c713d32"} Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.264649 4725 scope.go:117] "RemoveContainer" containerID="705c37c73d693b7b7209e8065ac9a9101a678727dab463150f97004cec33d755" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.264462 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.293754 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.307537 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.324893 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:14:41 crc kubenswrapper[4725]: E0225 11:14:41.325386 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e774709e-e8b9-420f-a2f0-1032219b0766" containerName="nova-scheduler-scheduler" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.325408 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e774709e-e8b9-420f-a2f0-1032219b0766" containerName="nova-scheduler-scheduler" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.325628 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e774709e-e8b9-420f-a2f0-1032219b0766" containerName="nova-scheduler-scheduler" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.326379 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.330231 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.339485 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.458596 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tst22\" (UniqueName: \"kubernetes.io/projected/f6b00346-164e-4c93-8333-5c1b47ee5ea9-kube-api-access-tst22\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.458658 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.459024 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.561683 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.561785 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tst22\" (UniqueName: \"kubernetes.io/projected/f6b00346-164e-4c93-8333-5c1b47ee5ea9-kube-api-access-tst22\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.561898 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.569396 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.570135 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.586448 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tst22\" (UniqueName: \"kubernetes.io/projected/f6b00346-164e-4c93-8333-5c1b47ee5ea9-kube-api-access-tst22\") pod \"nova-scheduler-0\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " pod="openstack/nova-scheduler-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.630015 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.630120 4725 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:14:41 crc kubenswrapper[4725]: I0225 11:14:41.667229 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.114315 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.153662 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:14:42 crc kubenswrapper[4725]: W0225 11:14:42.159416 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6b00346_164e_4c93_8333_5c1b47ee5ea9.slice/crio-1492f6e59bb2b232ea7f64fed5420f2791172abce5aaf69ee875846e56690f59 WatchSource:0}: Error finding container 1492f6e59bb2b232ea7f64fed5420f2791172abce5aaf69ee875846e56690f59: Status 404 returned error can't find the container with id 1492f6e59bb2b232ea7f64fed5420f2791172abce5aaf69ee875846e56690f59 Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.176112 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-config-data\") pod \"a066e46e-d22c-4abf-bd26-653e283efc51\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.176278 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-combined-ca-bundle\") pod \"a066e46e-d22c-4abf-bd26-653e283efc51\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.176380 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a066e46e-d22c-4abf-bd26-653e283efc51-logs\") pod \"a066e46e-d22c-4abf-bd26-653e283efc51\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.176452 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjddt\" (UniqueName: \"kubernetes.io/projected/a066e46e-d22c-4abf-bd26-653e283efc51-kube-api-access-mjddt\") pod \"a066e46e-d22c-4abf-bd26-653e283efc51\" (UID: \"a066e46e-d22c-4abf-bd26-653e283efc51\") " Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.177741 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a066e46e-d22c-4abf-bd26-653e283efc51-logs" (OuterVolumeSpecName: "logs") pod "a066e46e-d22c-4abf-bd26-653e283efc51" (UID: "a066e46e-d22c-4abf-bd26-653e283efc51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.197991 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a066e46e-d22c-4abf-bd26-653e283efc51-kube-api-access-mjddt" (OuterVolumeSpecName: "kube-api-access-mjddt") pod "a066e46e-d22c-4abf-bd26-653e283efc51" (UID: "a066e46e-d22c-4abf-bd26-653e283efc51"). InnerVolumeSpecName "kube-api-access-mjddt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.203995 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-config-data" (OuterVolumeSpecName: "config-data") pod "a066e46e-d22c-4abf-bd26-653e283efc51" (UID: "a066e46e-d22c-4abf-bd26-653e283efc51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.216012 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a066e46e-d22c-4abf-bd26-653e283efc51" (UID: "a066e46e-d22c-4abf-bd26-653e283efc51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.280817 4725 generic.go:334] "Generic (PLEG): container finished" podID="a066e46e-d22c-4abf-bd26-653e283efc51" containerID="d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761" exitCode=0 Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.280892 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a066e46e-d22c-4abf-bd26-653e283efc51","Type":"ContainerDied","Data":"d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761"} Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.280918 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a066e46e-d22c-4abf-bd26-653e283efc51","Type":"ContainerDied","Data":"ae75bb126f16c23e9741836ec0e1a52d2cc828fb19ff57abb3416b82f0707bb5"} Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.280934 4725 scope.go:117] "RemoveContainer" containerID="d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.281067 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.283160 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjddt\" (UniqueName: \"kubernetes.io/projected/a066e46e-d22c-4abf-bd26-653e283efc51-kube-api-access-mjddt\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.283194 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.283209 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a066e46e-d22c-4abf-bd26-653e283efc51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.283221 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a066e46e-d22c-4abf-bd26-653e283efc51-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.286528 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6b00346-164e-4c93-8333-5c1b47ee5ea9","Type":"ContainerStarted","Data":"1492f6e59bb2b232ea7f64fed5420f2791172abce5aaf69ee875846e56690f59"} Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.315005 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.316793 4725 scope.go:117] "RemoveContainer" containerID="37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.323414 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.340699 4725 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Feb 25 11:14:42 crc kubenswrapper[4725]: E0225 11:14:42.341062 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-api" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.341074 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-api" Feb 25 11:14:42 crc kubenswrapper[4725]: E0225 11:14:42.341092 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-log" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.341099 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-log" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.341256 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-log" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.341284 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" containerName="nova-api-api" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.342185 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.346882 4725 scope.go:117] "RemoveContainer" containerID="d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761" Feb 25 11:14:42 crc kubenswrapper[4725]: E0225 11:14:42.347314 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761\": container with ID starting with d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761 not found: ID does not exist" containerID="d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.347345 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761"} err="failed to get container status \"d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761\": rpc error: code = NotFound desc = could not find container \"d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761\": container with ID starting with d0f1ea6e138b183ba3edc80f372c8e30191be26dc6864f7809ded5c357526761 not found: ID does not exist" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.347365 4725 scope.go:117] "RemoveContainer" containerID="37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.347768 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 11:14:42 crc kubenswrapper[4725]: E0225 11:14:42.347914 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243\": container with ID starting with 37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243 not found: 
ID does not exist" containerID="37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.347961 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243"} err="failed to get container status \"37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243\": rpc error: code = NotFound desc = could not find container \"37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243\": container with ID starting with 37155c408d2475196338cefbe18b68beca0d7b708e6fe625331f98a491ea4243 not found: ID does not exist" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.365028 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.385392 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.385506 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wl59\" (UniqueName: \"kubernetes.io/projected/4f7317d2-13de-4b7d-a525-37a7c45030de-kube-api-access-8wl59\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.385548 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-config-data\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc 
kubenswrapper[4725]: I0225 11:14:42.385701 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7317d2-13de-4b7d-a525-37a7c45030de-logs\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.487439 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.487519 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wl59\" (UniqueName: \"kubernetes.io/projected/4f7317d2-13de-4b7d-a525-37a7c45030de-kube-api-access-8wl59\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.487539 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-config-data\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.487620 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7317d2-13de-4b7d-a525-37a7c45030de-logs\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.488127 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7317d2-13de-4b7d-a525-37a7c45030de-logs\") pod \"nova-api-0\" (UID: 
\"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.492503 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.497600 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-config-data\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.504985 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wl59\" (UniqueName: \"kubernetes.io/projected/4f7317d2-13de-4b7d-a525-37a7c45030de-kube-api-access-8wl59\") pod \"nova-api-0\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " pod="openstack/nova-api-0" Feb 25 11:14:42 crc kubenswrapper[4725]: I0225 11:14:42.676017 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:14:43 crc kubenswrapper[4725]: I0225 11:14:43.157107 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:14:43 crc kubenswrapper[4725]: W0225 11:14:43.162142 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f7317d2_13de_4b7d_a525_37a7c45030de.slice/crio-1bd3ebbe0ca12e859610fad542db3c0b091d4487b4019d980eecc784dc7ddb10 WatchSource:0}: Error finding container 1bd3ebbe0ca12e859610fad542db3c0b091d4487b4019d980eecc784dc7ddb10: Status 404 returned error can't find the container with id 1bd3ebbe0ca12e859610fad542db3c0b091d4487b4019d980eecc784dc7ddb10 Feb 25 11:14:43 crc kubenswrapper[4725]: I0225 11:14:43.239701 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a066e46e-d22c-4abf-bd26-653e283efc51" path="/var/lib/kubelet/pods/a066e46e-d22c-4abf-bd26-653e283efc51/volumes" Feb 25 11:14:43 crc kubenswrapper[4725]: I0225 11:14:43.241431 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e774709e-e8b9-420f-a2f0-1032219b0766" path="/var/lib/kubelet/pods/e774709e-e8b9-420f-a2f0-1032219b0766/volumes" Feb 25 11:14:43 crc kubenswrapper[4725]: I0225 11:14:43.296859 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6b00346-164e-4c93-8333-5c1b47ee5ea9","Type":"ContainerStarted","Data":"f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f"} Feb 25 11:14:43 crc kubenswrapper[4725]: I0225 11:14:43.300766 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f7317d2-13de-4b7d-a525-37a7c45030de","Type":"ContainerStarted","Data":"1bd3ebbe0ca12e859610fad542db3c0b091d4487b4019d980eecc784dc7ddb10"} Feb 25 11:14:43 crc kubenswrapper[4725]: I0225 11:14:43.317870 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.317850103 podStartE2EDuration="2.317850103s" podCreationTimestamp="2026-02-25 11:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:14:43.312459378 +0000 UTC m=+1308.811041403" watchObservedRunningTime="2026-02-25 11:14:43.317850103 +0000 UTC m=+1308.816432138" Feb 25 11:14:44 crc kubenswrapper[4725]: I0225 11:14:44.312338 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f7317d2-13de-4b7d-a525-37a7c45030de","Type":"ContainerStarted","Data":"2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af"} Feb 25 11:14:44 crc kubenswrapper[4725]: I0225 11:14:44.312698 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f7317d2-13de-4b7d-a525-37a7c45030de","Type":"ContainerStarted","Data":"b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34"} Feb 25 11:14:44 crc kubenswrapper[4725]: I0225 11:14:44.341537 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.341502614 podStartE2EDuration="2.341502614s" podCreationTimestamp="2026-02-25 11:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:14:44.336842999 +0000 UTC m=+1309.835425044" watchObservedRunningTime="2026-02-25 11:14:44.341502614 +0000 UTC m=+1309.840084639" Feb 25 11:14:46 crc kubenswrapper[4725]: I0225 11:14:46.628992 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 25 11:14:46 crc kubenswrapper[4725]: I0225 11:14:46.630302 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 11:14:46 crc kubenswrapper[4725]: I0225 11:14:46.630368 4725 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 11:14:46 crc kubenswrapper[4725]: I0225 11:14:46.668106 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 11:14:47 crc kubenswrapper[4725]: I0225 11:14:47.644950 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 11:14:47 crc kubenswrapper[4725]: I0225 11:14:47.644976 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 11:14:51 crc kubenswrapper[4725]: I0225 11:14:51.668420 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 25 11:14:51 crc kubenswrapper[4725]: I0225 11:14:51.700804 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 25 11:14:52 crc kubenswrapper[4725]: I0225 11:14:52.457145 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 25 11:14:52 crc kubenswrapper[4725]: I0225 11:14:52.676727 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 11:14:52 crc kubenswrapper[4725]: I0225 11:14:52.676809 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 11:14:53 crc kubenswrapper[4725]: I0225 11:14:53.759087 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 11:14:53 crc kubenswrapper[4725]: I0225 11:14:53.759080 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 11:14:56 crc kubenswrapper[4725]: I0225 11:14:56.641301 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 11:14:56 crc kubenswrapper[4725]: I0225 11:14:56.642155 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 11:14:56 crc kubenswrapper[4725]: I0225 11:14:56.653070 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 11:14:56 crc kubenswrapper[4725]: I0225 11:14:56.653546 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.406598 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.499793 4725 generic.go:334] "Generic (PLEG): container finished" podID="f908cbdf-92d0-4356-8139-2919a723a457" containerID="4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8" exitCode=137 Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.499857 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.499878 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f908cbdf-92d0-4356-8139-2919a723a457","Type":"ContainerDied","Data":"4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8"} Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.499916 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f908cbdf-92d0-4356-8139-2919a723a457","Type":"ContainerDied","Data":"9a2cf3a7ae2630ec4cfd70193903c0c4610c8831f114bf82bd51072311dd7565"} Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.499933 4725 scope.go:117] "RemoveContainer" containerID="4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.521429 4725 scope.go:117] "RemoveContainer" containerID="4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8" Feb 25 11:14:59 crc kubenswrapper[4725]: E0225 11:14:59.522159 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8\": container with ID starting with 4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8 not found: ID does not exist" containerID="4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.522250 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8"} err="failed to get container status \"4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8\": rpc error: code = NotFound desc = could not find container \"4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8\": container with ID starting with 
4b92a5f28e57cc38dd7a594060ad275223c1a45a9f918cc5d4e7384e01f6b6f8 not found: ID does not exist" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.542084 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-config-data\") pod \"f908cbdf-92d0-4356-8139-2919a723a457\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.542227 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbqj2\" (UniqueName: \"kubernetes.io/projected/f908cbdf-92d0-4356-8139-2919a723a457-kube-api-access-dbqj2\") pod \"f908cbdf-92d0-4356-8139-2919a723a457\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.542544 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-combined-ca-bundle\") pod \"f908cbdf-92d0-4356-8139-2919a723a457\" (UID: \"f908cbdf-92d0-4356-8139-2919a723a457\") " Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.547399 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f908cbdf-92d0-4356-8139-2919a723a457-kube-api-access-dbqj2" (OuterVolumeSpecName: "kube-api-access-dbqj2") pod "f908cbdf-92d0-4356-8139-2919a723a457" (UID: "f908cbdf-92d0-4356-8139-2919a723a457"). InnerVolumeSpecName "kube-api-access-dbqj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.573240 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-config-data" (OuterVolumeSpecName: "config-data") pod "f908cbdf-92d0-4356-8139-2919a723a457" (UID: "f908cbdf-92d0-4356-8139-2919a723a457"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.574903 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f908cbdf-92d0-4356-8139-2919a723a457" (UID: "f908cbdf-92d0-4356-8139-2919a723a457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.646092 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.646136 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbqj2\" (UniqueName: \"kubernetes.io/projected/f908cbdf-92d0-4356-8139-2919a723a457-kube-api-access-dbqj2\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.646156 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f908cbdf-92d0-4356-8139-2919a723a457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.856285 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.866053 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.878613 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 11:14:59 crc kubenswrapper[4725]: E0225 11:14:59.879163 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f908cbdf-92d0-4356-8139-2919a723a457" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.879189 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f908cbdf-92d0-4356-8139-2919a723a457" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.879400 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f908cbdf-92d0-4356-8139-2919a723a457" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.880112 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.882763 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.882798 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.883220 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 25 11:14:59 crc kubenswrapper[4725]: I0225 11:14:59.895146 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.053437 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.054077 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.054135 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7vbf\" (UniqueName: \"kubernetes.io/projected/53df7811-b191-4c54-b2c4-5faed23e2cc3-kube-api-access-t7vbf\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.054240 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.054283 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.157004 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.157471 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.157785 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.158380 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.159561 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7vbf\" (UniqueName: \"kubernetes.io/projected/53df7811-b191-4c54-b2c4-5faed23e2cc3-kube-api-access-t7vbf\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.165800 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.166982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.167824 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.169602 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/53df7811-b191-4c54-b2c4-5faed23e2cc3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.170194 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz"] Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.172494 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.180781 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.181986 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.194089 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz"] Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.198473 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7vbf\" (UniqueName: \"kubernetes.io/projected/53df7811-b191-4c54-b2c4-5faed23e2cc3-kube-api-access-t7vbf\") pod \"nova-cell1-novncproxy-0\" (UID: \"53df7811-b191-4c54-b2c4-5faed23e2cc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.208787 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.364730 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxbj\" (UniqueName: \"kubernetes.io/projected/2de1f14b-c4f4-4751-8fc6-8d4336738638-kube-api-access-8wxbj\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.365227 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2de1f14b-c4f4-4751-8fc6-8d4336738638-config-volume\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.365397 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2de1f14b-c4f4-4751-8fc6-8d4336738638-secret-volume\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.467500 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2de1f14b-c4f4-4751-8fc6-8d4336738638-secret-volume\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.467655 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wxbj\" (UniqueName: 
\"kubernetes.io/projected/2de1f14b-c4f4-4751-8fc6-8d4336738638-kube-api-access-8wxbj\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.467702 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2de1f14b-c4f4-4751-8fc6-8d4336738638-config-volume\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.469039 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2de1f14b-c4f4-4751-8fc6-8d4336738638-config-volume\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.479977 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2de1f14b-c4f4-4751-8fc6-8d4336738638-secret-volume\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.502057 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxbj\" (UniqueName: \"kubernetes.io/projected/2de1f14b-c4f4-4751-8fc6-8d4336738638-kube-api-access-8wxbj\") pod \"collect-profiles-29533635-5jjnz\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.514731 4725 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 11:15:00 crc kubenswrapper[4725]: I0225 11:15:00.624731 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.094471 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz"] Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.240974 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f908cbdf-92d0-4356-8139-2919a723a457" path="/var/lib/kubelet/pods/f908cbdf-92d0-4356-8139-2919a723a457/volumes" Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.524107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53df7811-b191-4c54-b2c4-5faed23e2cc3","Type":"ContainerStarted","Data":"1136f137751702cc75f8bcc28eb08b2d255cf8b5ae8d8704b5a791ded784d3a2"} Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.524174 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53df7811-b191-4c54-b2c4-5faed23e2cc3","Type":"ContainerStarted","Data":"6a6f69cfb790e165cda54b13291010708432834b19c8e738d51938e1d8843a0d"} Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.525630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" event={"ID":"2de1f14b-c4f4-4751-8fc6-8d4336738638","Type":"ContainerStarted","Data":"e29609f6451245cb476005c03ddd27d75a8bedf9eabc04e230fd966e7a1f9e12"} Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.525665 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" 
event={"ID":"2de1f14b-c4f4-4751-8fc6-8d4336738638","Type":"ContainerStarted","Data":"690078c4fc528a96b925e2766b2a98cf246ea203d52f1d971e0fdc670e0bdd7c"} Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.553233 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.553212893 podStartE2EDuration="2.553212893s" podCreationTimestamp="2026-02-25 11:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:01.552132564 +0000 UTC m=+1327.050714609" watchObservedRunningTime="2026-02-25 11:15:01.553212893 +0000 UTC m=+1327.051794918" Feb 25 11:15:01 crc kubenswrapper[4725]: I0225 11:15:01.573241 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" podStartSLOduration=1.57322074 podStartE2EDuration="1.57322074s" podCreationTimestamp="2026-02-25 11:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:01.572729437 +0000 UTC m=+1327.071311472" watchObservedRunningTime="2026-02-25 11:15:01.57322074 +0000 UTC m=+1327.071802765" Feb 25 11:15:02 crc kubenswrapper[4725]: I0225 11:15:02.541553 4725 generic.go:334] "Generic (PLEG): container finished" podID="2de1f14b-c4f4-4751-8fc6-8d4336738638" containerID="e29609f6451245cb476005c03ddd27d75a8bedf9eabc04e230fd966e7a1f9e12" exitCode=0 Feb 25 11:15:02 crc kubenswrapper[4725]: I0225 11:15:02.542069 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" event={"ID":"2de1f14b-c4f4-4751-8fc6-8d4336738638","Type":"ContainerDied","Data":"e29609f6451245cb476005c03ddd27d75a8bedf9eabc04e230fd966e7a1f9e12"} Feb 25 11:15:02 crc kubenswrapper[4725]: I0225 11:15:02.681322 4725 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 11:15:02 crc kubenswrapper[4725]: I0225 11:15:02.682051 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 11:15:02 crc kubenswrapper[4725]: I0225 11:15:02.684903 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 11:15:02 crc kubenswrapper[4725]: I0225 11:15:02.686941 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.553268 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.558946 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.785201 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xmk5s"] Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.786630 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.815593 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xmk5s"] Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.847379 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.847433 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.847466 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-config\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.847487 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.847540 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q2h7b\" (UniqueName: \"kubernetes.io/projected/ed310acc-141b-4704-85b7-cc6761c13c0a-kube-api-access-q2h7b\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.847646 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.952024 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.952118 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.952150 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.952180 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-config\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.952202 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.952251 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2h7b\" (UniqueName: \"kubernetes.io/projected/ed310acc-141b-4704-85b7-cc6761c13c0a-kube-api-access-q2h7b\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.953364 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.953504 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.953663 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-config\") pod 
\"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.953933 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.954302 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:03 crc kubenswrapper[4725]: I0225 11:15:03.988648 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2h7b\" (UniqueName: \"kubernetes.io/projected/ed310acc-141b-4704-85b7-cc6761c13c0a-kube-api-access-q2h7b\") pod \"dnsmasq-dns-89c5cd4d5-xmk5s\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.116482 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.136327 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.160235 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wxbj\" (UniqueName: \"kubernetes.io/projected/2de1f14b-c4f4-4751-8fc6-8d4336738638-kube-api-access-8wxbj\") pod \"2de1f14b-c4f4-4751-8fc6-8d4336738638\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.160278 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2de1f14b-c4f4-4751-8fc6-8d4336738638-secret-volume\") pod \"2de1f14b-c4f4-4751-8fc6-8d4336738638\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.160649 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2de1f14b-c4f4-4751-8fc6-8d4336738638-config-volume\") pod \"2de1f14b-c4f4-4751-8fc6-8d4336738638\" (UID: \"2de1f14b-c4f4-4751-8fc6-8d4336738638\") " Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.164781 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de1f14b-c4f4-4751-8fc6-8d4336738638-config-volume" (OuterVolumeSpecName: "config-volume") pod "2de1f14b-c4f4-4751-8fc6-8d4336738638" (UID: "2de1f14b-c4f4-4751-8fc6-8d4336738638"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.165017 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2de1f14b-c4f4-4751-8fc6-8d4336738638-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.168100 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de1f14b-c4f4-4751-8fc6-8d4336738638-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2de1f14b-c4f4-4751-8fc6-8d4336738638" (UID: "2de1f14b-c4f4-4751-8fc6-8d4336738638"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.168899 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de1f14b-c4f4-4751-8fc6-8d4336738638-kube-api-access-8wxbj" (OuterVolumeSpecName: "kube-api-access-8wxbj") pod "2de1f14b-c4f4-4751-8fc6-8d4336738638" (UID: "2de1f14b-c4f4-4751-8fc6-8d4336738638"). InnerVolumeSpecName "kube-api-access-8wxbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.274388 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wxbj\" (UniqueName: \"kubernetes.io/projected/2de1f14b-c4f4-4751-8fc6-8d4336738638-kube-api-access-8wxbj\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.274426 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2de1f14b-c4f4-4751-8fc6-8d4336738638-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.563018 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.568068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz" event={"ID":"2de1f14b-c4f4-4751-8fc6-8d4336738638","Type":"ContainerDied","Data":"690078c4fc528a96b925e2766b2a98cf246ea203d52f1d971e0fdc670e0bdd7c"} Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.568187 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="690078c4fc528a96b925e2766b2a98cf246ea203d52f1d971e0fdc670e0bdd7c" Feb 25 11:15:04 crc kubenswrapper[4725]: I0225 11:15:04.584517 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xmk5s"] Feb 25 11:15:04 crc kubenswrapper[4725]: E0225 11:15:04.905755 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded310acc_141b_4704_85b7_cc6761c13c0a.slice/crio-conmon-c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72.scope\": RecentStats: unable to find data in memory cache]" Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.210590 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.574735 4725 generic.go:334] "Generic (PLEG): container finished" podID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerID="c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72" exitCode=0 Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.574804 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" event={"ID":"ed310acc-141b-4704-85b7-cc6761c13c0a","Type":"ContainerDied","Data":"c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72"} Feb 25 11:15:05 
crc kubenswrapper[4725]: I0225 11:15:05.574884 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" event={"ID":"ed310acc-141b-4704-85b7-cc6761c13c0a","Type":"ContainerStarted","Data":"185f245afcb9e06bc93745de0f7704bdbe1bba118a4691f9ae8622ef54a6a87c"} Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.946053 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.946494 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-central-agent" containerID="cri-o://12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99" gracePeriod=30 Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.946583 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="proxy-httpd" containerID="cri-o://9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb" gracePeriod=30 Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.946612 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="sg-core" containerID="cri-o://0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23" gracePeriod=30 Feb 25 11:15:05 crc kubenswrapper[4725]: I0225 11:15:05.946641 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-notification-agent" containerID="cri-o://ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a" gracePeriod=30 Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.233140 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 
11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.590867 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" event={"ID":"ed310acc-141b-4704-85b7-cc6761c13c0a","Type":"ContainerStarted","Data":"b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd"} Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.591006 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595253 4725 generic.go:334] "Generic (PLEG): container finished" podID="040deb18-257b-4642-8df3-2d7da1389ce6" containerID="9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb" exitCode=0 Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595282 4725 generic.go:334] "Generic (PLEG): container finished" podID="040deb18-257b-4642-8df3-2d7da1389ce6" containerID="0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23" exitCode=2 Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595289 4725 generic.go:334] "Generic (PLEG): container finished" podID="040deb18-257b-4642-8df3-2d7da1389ce6" containerID="12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99" exitCode=0 Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595308 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerDied","Data":"9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb"} Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595349 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerDied","Data":"0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23"} Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595359 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerDied","Data":"12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99"} Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595451 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-log" containerID="cri-o://b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34" gracePeriod=30 Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.595502 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-api" containerID="cri-o://2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af" gracePeriod=30 Feb 25 11:15:06 crc kubenswrapper[4725]: I0225 11:15:06.622381 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" podStartSLOduration=3.622365063 podStartE2EDuration="3.622365063s" podCreationTimestamp="2026-02-25 11:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:06.616584608 +0000 UTC m=+1332.115166633" watchObservedRunningTime="2026-02-25 11:15:06.622365063 +0000 UTC m=+1332.120947088" Feb 25 11:15:07 crc kubenswrapper[4725]: I0225 11:15:07.608762 4725 generic.go:334] "Generic (PLEG): container finished" podID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerID="b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34" exitCode=143 Feb 25 11:15:07 crc kubenswrapper[4725]: I0225 11:15:07.609480 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f7317d2-13de-4b7d-a525-37a7c45030de","Type":"ContainerDied","Data":"b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34"} Feb 25 11:15:10 crc kubenswrapper[4725]: 
I0225 11:15:10.213037 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.251973 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.313688 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.318779 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411040 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7317d2-13de-4b7d-a525-37a7c45030de-logs\") pod \"4f7317d2-13de-4b7d-a525-37a7c45030de\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411464 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-config-data\") pod \"4f7317d2-13de-4b7d-a525-37a7c45030de\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411503 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f7317d2-13de-4b7d-a525-37a7c45030de-logs" (OuterVolumeSpecName: "logs") pod "4f7317d2-13de-4b7d-a525-37a7c45030de" (UID: "4f7317d2-13de-4b7d-a525-37a7c45030de"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411522 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-run-httpd\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411587 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-798nd\" (UniqueName: \"kubernetes.io/projected/040deb18-257b-4642-8df3-2d7da1389ce6-kube-api-access-798nd\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411631 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-config-data\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411669 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411735 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wl59\" (UniqueName: \"kubernetes.io/projected/4f7317d2-13de-4b7d-a525-37a7c45030de-kube-api-access-8wl59\") pod \"4f7317d2-13de-4b7d-a525-37a7c45030de\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411768 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-scripts\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411816 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-combined-ca-bundle\") pod \"4f7317d2-13de-4b7d-a525-37a7c45030de\" (UID: \"4f7317d2-13de-4b7d-a525-37a7c45030de\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411890 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-ceilometer-tls-certs\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411945 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-log-httpd\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411998 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-sg-core-conf-yaml\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.411891 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.412633 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7317d2-13de-4b7d-a525-37a7c45030de-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.412658 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.412804 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.417089 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-scripts" (OuterVolumeSpecName: "scripts") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.418449 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f7317d2-13de-4b7d-a525-37a7c45030de-kube-api-access-8wl59" (OuterVolumeSpecName: "kube-api-access-8wl59") pod "4f7317d2-13de-4b7d-a525-37a7c45030de" (UID: "4f7317d2-13de-4b7d-a525-37a7c45030de"). InnerVolumeSpecName "kube-api-access-8wl59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.419306 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040deb18-257b-4642-8df3-2d7da1389ce6-kube-api-access-798nd" (OuterVolumeSpecName: "kube-api-access-798nd") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "kube-api-access-798nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.445209 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-config-data" (OuterVolumeSpecName: "config-data") pod "4f7317d2-13de-4b7d-a525-37a7c45030de" (UID: "4f7317d2-13de-4b7d-a525-37a7c45030de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.457724 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f7317d2-13de-4b7d-a525-37a7c45030de" (UID: "4f7317d2-13de-4b7d-a525-37a7c45030de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.462769 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.485128 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514767 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514796 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-798nd\" (UniqueName: \"kubernetes.io/projected/040deb18-257b-4642-8df3-2d7da1389ce6-kube-api-access-798nd\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514806 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wl59\" (UniqueName: \"kubernetes.io/projected/4f7317d2-13de-4b7d-a525-37a7c45030de-kube-api-access-8wl59\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514817 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514837 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7317d2-13de-4b7d-a525-37a7c45030de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514845 4725 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514854 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/040deb18-257b-4642-8df3-2d7da1389ce6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.514861 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.541210 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle podName:040deb18-257b-4642-8df3-2d7da1389ce6 nodeName:}" failed. No retries permitted until 2026-02-25 11:15:11.0411848 +0000 UTC m=+1336.539766825 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6") : error deleting /var/lib/kubelet/pods/040deb18-257b-4642-8df3-2d7da1389ce6/volume-subpaths: remove /var/lib/kubelet/pods/040deb18-257b-4642-8df3-2d7da1389ce6/volume-subpaths: no such file or directory Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.543700 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-config-data" (OuterVolumeSpecName: "config-data") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.616316 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.643589 4725 generic.go:334] "Generic (PLEG): container finished" podID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerID="2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af" exitCode=0 Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.643631 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.643652 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f7317d2-13de-4b7d-a525-37a7c45030de","Type":"ContainerDied","Data":"2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af"} Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.643678 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f7317d2-13de-4b7d-a525-37a7c45030de","Type":"ContainerDied","Data":"1bd3ebbe0ca12e859610fad542db3c0b091d4487b4019d980eecc784dc7ddb10"} Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.643694 4725 scope.go:117] "RemoveContainer" containerID="2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.648939 4725 generic.go:334] "Generic (PLEG): container finished" podID="040deb18-257b-4642-8df3-2d7da1389ce6" containerID="ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a" exitCode=0 Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.650406 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.651064 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerDied","Data":"ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a"} Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.651102 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"040deb18-257b-4642-8df3-2d7da1389ce6","Type":"ContainerDied","Data":"74561fbedfe6cac9bdae45010488bd68c5945e07dab86c6aed334f67efdccb4d"} Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.667149 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.668996 4725 scope.go:117] "RemoveContainer" containerID="b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.686522 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.693903 4725 scope.go:117] "RemoveContainer" containerID="2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.694320 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af\": container with ID starting with 2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af not found: ID does not exist" containerID="2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.694372 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af"} err="failed to get container status \"2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af\": rpc error: code = NotFound desc = could not find container \"2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af\": container with ID starting with 2f28c0525ec32282f9b18a4433949e752f348652d57c2d2a0e7bf0c074f736af not found: ID does not exist" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.694400 4725 scope.go:117] "RemoveContainer" containerID="b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.694838 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34\": container with ID starting with b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34 not found: ID does not exist" containerID="b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.694874 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34"} err="failed to get container status \"b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34\": rpc error: code = NotFound desc = could not find container \"b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34\": container with ID starting with b08a77874a1c543a5fe8943beeeb1dacc986efc7b43e4c8e286e5fe482ae3c34 not found: ID does not exist" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.694899 4725 scope.go:117] "RemoveContainer" containerID="9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.702941 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-0"] Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.750976 4725 scope.go:117] "RemoveContainer" containerID="0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.764764 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.765394 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de1f14b-c4f4-4751-8fc6-8d4336738638" containerName="collect-profiles" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.765473 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de1f14b-c4f4-4751-8fc6-8d4336738638" containerName="collect-profiles" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.765539 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-notification-agent" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.765600 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-notification-agent" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.765667 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="proxy-httpd" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.765718 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="proxy-httpd" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.765774 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-central-agent" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.765839 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-central-agent" Feb 25 11:15:10 crc 
kubenswrapper[4725]: E0225 11:15:10.765914 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-api" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.765966 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-api" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.766025 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="sg-core" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766074 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="sg-core" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.766127 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-log" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766175 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-log" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766387 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de1f14b-c4f4-4751-8fc6-8d4336738638" containerName="collect-profiles" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766451 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="proxy-httpd" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766517 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-central-agent" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766577 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="sg-core" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 
11:15:10.766627 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-api" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766709 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" containerName="ceilometer-notification-agent" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.766767 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" containerName="nova-api-log" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.767720 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.772503 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.797986 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.781647 4725 scope.go:117] "RemoveContainer" containerID="ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.790641 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.798128 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.819974 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfx4\" (UniqueName: \"kubernetes.io/projected/ebaf12e6-82fc-4885-9428-b186379c6009-kube-api-access-rnfx4\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 
11:15:10.820036 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-public-tls-certs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.820100 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-config-data\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.820119 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.820157 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebaf12e6-82fc-4885-9428-b186379c6009-logs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.820192 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.897978 4725 scope.go:117] "RemoveContainer" containerID="12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99" Feb 25 11:15:10 crc 
kubenswrapper[4725]: I0225 11:15:10.921724 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebaf12e6-82fc-4885-9428-b186379c6009-logs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.921788 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.921860 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfx4\" (UniqueName: \"kubernetes.io/projected/ebaf12e6-82fc-4885-9428-b186379c6009-kube-api-access-rnfx4\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.921890 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-public-tls-certs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.921948 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-config-data\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.921965 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.922668 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebaf12e6-82fc-4885-9428-b186379c6009-logs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.927256 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.927581 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-config-data\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.931837 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.934170 4725 scope.go:117] "RemoveContainer" containerID="9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.934706 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb\": container with ID starting with 9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb not 
found: ID does not exist" containerID="9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.934748 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb"} err="failed to get container status \"9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb\": rpc error: code = NotFound desc = could not find container \"9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb\": container with ID starting with 9dc7e5ed6ecca8a1f41ee6415dc70dfb16ce3da98ed5cbab9085bc7415d583bb not found: ID does not exist" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.934779 4725 scope.go:117] "RemoveContainer" containerID="0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.935238 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23\": container with ID starting with 0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23 not found: ID does not exist" containerID="0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.935351 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23"} err="failed to get container status \"0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23\": rpc error: code = NotFound desc = could not find container \"0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23\": container with ID starting with 0a1ef69aa97fa20533b72f53614369f515213be98b8c4ffcc627e727e7bf9d23 not found: ID does not exist" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.935443 
4725 scope.go:117] "RemoveContainer" containerID="ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.935872 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a\": container with ID starting with ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a not found: ID does not exist" containerID="ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.935898 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a"} err="failed to get container status \"ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a\": rpc error: code = NotFound desc = could not find container \"ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a\": container with ID starting with ed417197643556aeb70965b957b18151b8e77b703afd21e42f9c360f22c1023a not found: ID does not exist" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.935915 4725 scope.go:117] "RemoveContainer" containerID="12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99" Feb 25 11:15:10 crc kubenswrapper[4725]: E0225 11:15:10.937411 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99\": container with ID starting with 12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99 not found: ID does not exist" containerID="12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.937509 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99"} err="failed to get container status \"12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99\": rpc error: code = NotFound desc = could not find container \"12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99\": container with ID starting with 12a14b07ff630cebf69b0b87686755b6aaa82427dd5c4db7c85b3a3a3b2b1d99 not found: ID does not exist" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.946783 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfx4\" (UniqueName: \"kubernetes.io/projected/ebaf12e6-82fc-4885-9428-b186379c6009-kube-api-access-rnfx4\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.949568 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-public-tls-certs\") pod \"nova-api-0\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") " pod="openstack/nova-api-0" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.951207 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-v6jpc"] Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.952462 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.964755 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6jpc"] Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.978783 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 25 11:15:10 crc kubenswrapper[4725]: I0225 11:15:10.979398 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.023756 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-config-data\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.023963 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkxdt\" (UniqueName: \"kubernetes.io/projected/3538d74b-8967-41d6-b4a4-add6bf1558ad-kube-api-access-vkxdt\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.024071 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-scripts\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.024324 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.092948 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.126246 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle\") pod \"040deb18-257b-4642-8df3-2d7da1389ce6\" (UID: \"040deb18-257b-4642-8df3-2d7da1389ce6\") " Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.126563 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-config-data\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.126626 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkxdt\" (UniqueName: \"kubernetes.io/projected/3538d74b-8967-41d6-b4a4-add6bf1558ad-kube-api-access-vkxdt\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.126673 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-scripts\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.126751 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.132538 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-config-data\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.133205 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "040deb18-257b-4642-8df3-2d7da1389ce6" (UID: "040deb18-257b-4642-8df3-2d7da1389ce6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.135551 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.136650 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-scripts\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.143097 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkxdt\" (UniqueName: \"kubernetes.io/projected/3538d74b-8967-41d6-b4a4-add6bf1558ad-kube-api-access-vkxdt\") pod \"nova-cell1-cell-mapping-v6jpc\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") " pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.228182 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/040deb18-257b-4642-8df3-2d7da1389ce6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.234048 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f7317d2-13de-4b7d-a525-37a7c45030de" path="/var/lib/kubelet/pods/4f7317d2-13de-4b7d-a525-37a7c45030de/volumes" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.275850 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.289906 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.300778 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.303808 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.307554 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.308860 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.309119 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.316800 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.320158 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6jpc" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.433514 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-config-data\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.433848 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b92e78-7b23-469e-9220-9ea38d9cba32-log-httpd\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.433901 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.434066 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.434110 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdkr7\" (UniqueName: \"kubernetes.io/projected/e2b92e78-7b23-469e-9220-9ea38d9cba32-kube-api-access-cdkr7\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: 
I0225 11:15:11.434139 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-scripts\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.434174 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b92e78-7b23-469e-9220-9ea38d9cba32-run-httpd\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.434207 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536087 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-config-data\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536132 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b92e78-7b23-469e-9220-9ea38d9cba32-log-httpd\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536173 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536241 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536266 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdkr7\" (UniqueName: \"kubernetes.io/projected/e2b92e78-7b23-469e-9220-9ea38d9cba32-kube-api-access-cdkr7\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536286 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-scripts\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536310 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b92e78-7b23-469e-9220-9ea38d9cba32-run-httpd\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.536363 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0" Feb 25 11:15:11 crc 
kubenswrapper[4725]: I0225 11:15:11.541643 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.542197 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.545326 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b92e78-7b23-469e-9220-9ea38d9cba32-log-httpd\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.545917 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b92e78-7b23-469e-9220-9ea38d9cba32-run-httpd\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.546073 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-config-data\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.546500 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-scripts\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.546706 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2b92e78-7b23-469e-9220-9ea38d9cba32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.565614 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdkr7\" (UniqueName: \"kubernetes.io/projected/e2b92e78-7b23-469e-9220-9ea38d9cba32-kube-api-access-cdkr7\") pod \"ceilometer-0\" (UID: \"e2b92e78-7b23-469e-9220-9ea38d9cba32\") " pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.592749 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.635396 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.733170 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebaf12e6-82fc-4885-9428-b186379c6009","Type":"ContainerStarted","Data":"63420fad53cddcd5d6727f3963e87b06658902ff32a53932ca45fa8df76525ef"}
Feb 25 11:15:11 crc kubenswrapper[4725]: I0225 11:15:11.794709 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6jpc"]
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.142015 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 25 11:15:12 crc kubenswrapper[4725]: W0225 11:15:12.147351 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b92e78_7b23_469e_9220_9ea38d9cba32.slice/crio-1cf594914a00d2e07750dce5e335c9d5d652e09c83901a64de51fd3e046dc474 WatchSource:0}: Error finding container 1cf594914a00d2e07750dce5e335c9d5d652e09c83901a64de51fd3e046dc474: Status 404 returned error can't find the container with id 1cf594914a00d2e07750dce5e335c9d5d652e09c83901a64de51fd3e046dc474
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.744326 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2b92e78-7b23-469e-9220-9ea38d9cba32","Type":"ContainerStarted","Data":"1cf594914a00d2e07750dce5e335c9d5d652e09c83901a64de51fd3e046dc474"}
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.749072 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6jpc" event={"ID":"3538d74b-8967-41d6-b4a4-add6bf1558ad","Type":"ContainerStarted","Data":"ce808751965ad771f7787ed4ef2630ae08dddd816d343a30dd7d728b22f88733"}
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.749105 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6jpc" event={"ID":"3538d74b-8967-41d6-b4a4-add6bf1558ad","Type":"ContainerStarted","Data":"abdad035e75dc3b1ceae60daeefa5a75be428337ac63b6f60241dbf3194b5fb1"}
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.754259 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebaf12e6-82fc-4885-9428-b186379c6009","Type":"ContainerStarted","Data":"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95"}
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.754324 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebaf12e6-82fc-4885-9428-b186379c6009","Type":"ContainerStarted","Data":"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe"}
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.778726 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-v6jpc" podStartSLOduration=2.77869886 podStartE2EDuration="2.77869886s" podCreationTimestamp="2026-02-25 11:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:12.771492077 +0000 UTC m=+1338.270074132" watchObservedRunningTime="2026-02-25 11:15:12.77869886 +0000 UTC m=+1338.277280925"
Feb 25 11:15:12 crc kubenswrapper[4725]: I0225 11:15:12.802299 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.802278973 podStartE2EDuration="2.802278973s" podCreationTimestamp="2026-02-25 11:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:12.788684748 +0000 UTC m=+1338.287266803" watchObservedRunningTime="2026-02-25 11:15:12.802278973 +0000 UTC m=+1338.300861008"
Feb 25 11:15:13 crc kubenswrapper[4725]: I0225 11:15:13.240845 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="040deb18-257b-4642-8df3-2d7da1389ce6" path="/var/lib/kubelet/pods/040deb18-257b-4642-8df3-2d7da1389ce6/volumes"
Feb 25 11:15:13 crc kubenswrapper[4725]: I0225 11:15:13.769034 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2b92e78-7b23-469e-9220-9ea38d9cba32","Type":"ContainerStarted","Data":"d763f4f0fca5270943a7ea3438820ae6eb2702b36817d20c41d0eda67e007600"}
Feb 25 11:15:13 crc kubenswrapper[4725]: I0225 11:15:13.769320 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2b92e78-7b23-469e-9220-9ea38d9cba32","Type":"ContainerStarted","Data":"0a2b2ffc58c711398dac44617bb4a676ca73ceffe9656290dc0fd05eec4dc2e8"}
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.119027 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.197748 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-h2f8n"]
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.198226 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerName="dnsmasq-dns" containerID="cri-o://9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c" gracePeriod=10
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.693298 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.778981 4725 generic.go:334] "Generic (PLEG): container finished" podID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerID="9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c" exitCode=0
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.779032 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.779061 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" event={"ID":"b356c2f5-ae04-4c30-932f-b0919fa9340c","Type":"ContainerDied","Data":"9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c"}
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.779111 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" event={"ID":"b356c2f5-ae04-4c30-932f-b0919fa9340c","Type":"ContainerDied","Data":"4a48797deb687f499501922852a52ab7a867f709395167a6778fc37da01d9eba"}
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.779130 4725 scope.go:117] "RemoveContainer" containerID="9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.785171 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2b92e78-7b23-469e-9220-9ea38d9cba32","Type":"ContainerStarted","Data":"de802502624f0e74917b24ef22c97fe8c0d8e8831dae1d3c9edbfd83efe38d78"}
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.797747 4725 scope.go:117] "RemoveContainer" containerID="65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.812216 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-svc\") pod \"b356c2f5-ae04-4c30-932f-b0919fa9340c\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") "
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.812364 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-sb\") pod \"b356c2f5-ae04-4c30-932f-b0919fa9340c\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") "
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.812411 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-swift-storage-0\") pod \"b356c2f5-ae04-4c30-932f-b0919fa9340c\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") "
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.812483 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-nb\") pod \"b356c2f5-ae04-4c30-932f-b0919fa9340c\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") "
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.812536 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z69hq\" (UniqueName: \"kubernetes.io/projected/b356c2f5-ae04-4c30-932f-b0919fa9340c-kube-api-access-z69hq\") pod \"b356c2f5-ae04-4c30-932f-b0919fa9340c\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") "
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.812558 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-config\") pod \"b356c2f5-ae04-4c30-932f-b0919fa9340c\" (UID: \"b356c2f5-ae04-4c30-932f-b0919fa9340c\") "
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.829147 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b356c2f5-ae04-4c30-932f-b0919fa9340c-kube-api-access-z69hq" (OuterVolumeSpecName: "kube-api-access-z69hq") pod "b356c2f5-ae04-4c30-932f-b0919fa9340c" (UID: "b356c2f5-ae04-4c30-932f-b0919fa9340c"). InnerVolumeSpecName "kube-api-access-z69hq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.831760 4725 scope.go:117] "RemoveContainer" containerID="9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c"
Feb 25 11:15:14 crc kubenswrapper[4725]: E0225 11:15:14.832973 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c\": container with ID starting with 9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c not found: ID does not exist" containerID="9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.833000 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c"} err="failed to get container status \"9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c\": rpc error: code = NotFound desc = could not find container \"9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c\": container with ID starting with 9076505f6cd9ba3a63929c533482351fcde517abd4241768238f26a82dad846c not found: ID does not exist"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.833020 4725 scope.go:117] "RemoveContainer" containerID="65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b"
Feb 25 11:15:14 crc kubenswrapper[4725]: E0225 11:15:14.833494 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b\": container with ID starting with 65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b not found: ID does not exist" containerID="65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.833522 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b"} err="failed to get container status \"65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b\": rpc error: code = NotFound desc = could not find container \"65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b\": container with ID starting with 65160be9ab4a0602106e5672cea2624324cfb4d7534e69d8345eb9313a03e22b not found: ID does not exist"
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.862396 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-config" (OuterVolumeSpecName: "config") pod "b356c2f5-ae04-4c30-932f-b0919fa9340c" (UID: "b356c2f5-ae04-4c30-932f-b0919fa9340c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.870312 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b356c2f5-ae04-4c30-932f-b0919fa9340c" (UID: "b356c2f5-ae04-4c30-932f-b0919fa9340c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.877186 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b356c2f5-ae04-4c30-932f-b0919fa9340c" (UID: "b356c2f5-ae04-4c30-932f-b0919fa9340c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.881839 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b356c2f5-ae04-4c30-932f-b0919fa9340c" (UID: "b356c2f5-ae04-4c30-932f-b0919fa9340c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.888341 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b356c2f5-ae04-4c30-932f-b0919fa9340c" (UID: "b356c2f5-ae04-4c30-932f-b0919fa9340c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.917023 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.917064 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.917074 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.917084 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z69hq\" (UniqueName: \"kubernetes.io/projected/b356c2f5-ae04-4c30-932f-b0919fa9340c-kube-api-access-z69hq\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.917094 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-config\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:14 crc kubenswrapper[4725]: I0225 11:15:14.917104 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b356c2f5-ae04-4c30-932f-b0919fa9340c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:15 crc kubenswrapper[4725]: I0225 11:15:15.145658 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-h2f8n"]
Feb 25 11:15:15 crc kubenswrapper[4725]: I0225 11:15:15.155928 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-h2f8n"]
Feb 25 11:15:15 crc kubenswrapper[4725]: I0225 11:15:15.234444 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" path="/var/lib/kubelet/pods/b356c2f5-ae04-4c30-932f-b0919fa9340c/volumes"
Feb 25 11:15:16 crc kubenswrapper[4725]: I0225 11:15:16.818997 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2b92e78-7b23-469e-9220-9ea38d9cba32","Type":"ContainerStarted","Data":"99e2860e0b491939e52e17eaacfd5422b312c5ceffea4e8f9b6d3b97c098eb2b"}
Feb 25 11:15:16 crc kubenswrapper[4725]: I0225 11:15:16.820874 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 25 11:15:16 crc kubenswrapper[4725]: I0225 11:15:16.825630 4725 generic.go:334] "Generic (PLEG): container finished" podID="3538d74b-8967-41d6-b4a4-add6bf1558ad" containerID="ce808751965ad771f7787ed4ef2630ae08dddd816d343a30dd7d728b22f88733" exitCode=0
Feb 25 11:15:16 crc kubenswrapper[4725]: I0225 11:15:16.825705 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6jpc" event={"ID":"3538d74b-8967-41d6-b4a4-add6bf1558ad","Type":"ContainerDied","Data":"ce808751965ad771f7787ed4ef2630ae08dddd816d343a30dd7d728b22f88733"}
Feb 25 11:15:16 crc kubenswrapper[4725]: I0225 11:15:16.854292 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.027825548 podStartE2EDuration="5.854271226s" podCreationTimestamp="2026-02-25 11:15:11 +0000 UTC" firstStartedPulling="2026-02-25 11:15:12.150024223 +0000 UTC m=+1337.648606248" lastFinishedPulling="2026-02-25 11:15:15.976469891 +0000 UTC m=+1341.475051926" observedRunningTime="2026-02-25 11:15:16.84846607 +0000 UTC m=+1342.347048105" watchObservedRunningTime="2026-02-25 11:15:16.854271226 +0000 UTC m=+1342.352853261"
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.178481 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6jpc"
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.282819 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-combined-ca-bundle\") pod \"3538d74b-8967-41d6-b4a4-add6bf1558ad\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") "
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.283223 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkxdt\" (UniqueName: \"kubernetes.io/projected/3538d74b-8967-41d6-b4a4-add6bf1558ad-kube-api-access-vkxdt\") pod \"3538d74b-8967-41d6-b4a4-add6bf1558ad\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") "
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.283301 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-config-data\") pod \"3538d74b-8967-41d6-b4a4-add6bf1558ad\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") "
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.283326 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-scripts\") pod \"3538d74b-8967-41d6-b4a4-add6bf1558ad\" (UID: \"3538d74b-8967-41d6-b4a4-add6bf1558ad\") "
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.288298 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3538d74b-8967-41d6-b4a4-add6bf1558ad-kube-api-access-vkxdt" (OuterVolumeSpecName: "kube-api-access-vkxdt") pod "3538d74b-8967-41d6-b4a4-add6bf1558ad" (UID: "3538d74b-8967-41d6-b4a4-add6bf1558ad"). InnerVolumeSpecName "kube-api-access-vkxdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.288769 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-scripts" (OuterVolumeSpecName: "scripts") pod "3538d74b-8967-41d6-b4a4-add6bf1558ad" (UID: "3538d74b-8967-41d6-b4a4-add6bf1558ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.310230 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-config-data" (OuterVolumeSpecName: "config-data") pod "3538d74b-8967-41d6-b4a4-add6bf1558ad" (UID: "3538d74b-8967-41d6-b4a4-add6bf1558ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.318794 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3538d74b-8967-41d6-b4a4-add6bf1558ad" (UID: "3538d74b-8967-41d6-b4a4-add6bf1558ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.385928 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.385963 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkxdt\" (UniqueName: \"kubernetes.io/projected/3538d74b-8967-41d6-b4a4-add6bf1558ad-kube-api-access-vkxdt\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.385972 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.385983 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3538d74b-8967-41d6-b4a4-add6bf1558ad-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.851506 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v6jpc" event={"ID":"3538d74b-8967-41d6-b4a4-add6bf1558ad","Type":"ContainerDied","Data":"abdad035e75dc3b1ceae60daeefa5a75be428337ac63b6f60241dbf3194b5fb1"}
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.851555 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v6jpc"
Feb 25 11:15:18 crc kubenswrapper[4725]: I0225 11:15:18.851571 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abdad035e75dc3b1ceae60daeefa5a75be428337ac63b6f60241dbf3194b5fb1"
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.097326 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.097988 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f6b00346-164e-4c93-8333-5c1b47ee5ea9" containerName="nova-scheduler-scheduler" containerID="cri-o://f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f" gracePeriod=30
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.113545 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.114282 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-log" containerID="cri-o://be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe" gracePeriod=30
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.114513 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-api" containerID="cri-o://9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95" gracePeriod=30
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.155363 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.155639 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-log" containerID="cri-o://ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962" gracePeriod=30
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.155797 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-metadata" containerID="cri-o://4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0" gracePeriod=30
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.626611 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-h2f8n" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.196:5353: i/o timeout"
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.675624 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.817136 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-config-data\") pod \"ebaf12e6-82fc-4885-9428-b186379c6009\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") "
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.817277 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-internal-tls-certs\") pod \"ebaf12e6-82fc-4885-9428-b186379c6009\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") "
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.817322 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-combined-ca-bundle\") pod \"ebaf12e6-82fc-4885-9428-b186379c6009\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") "
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.817352 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-public-tls-certs\") pod \"ebaf12e6-82fc-4885-9428-b186379c6009\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") "
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.817418 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebaf12e6-82fc-4885-9428-b186379c6009-logs\") pod \"ebaf12e6-82fc-4885-9428-b186379c6009\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") "
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.817443 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnfx4\" (UniqueName: \"kubernetes.io/projected/ebaf12e6-82fc-4885-9428-b186379c6009-kube-api-access-rnfx4\") pod \"ebaf12e6-82fc-4885-9428-b186379c6009\" (UID: \"ebaf12e6-82fc-4885-9428-b186379c6009\") "
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.817859 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebaf12e6-82fc-4885-9428-b186379c6009-logs" (OuterVolumeSpecName: "logs") pod "ebaf12e6-82fc-4885-9428-b186379c6009" (UID: "ebaf12e6-82fc-4885-9428-b186379c6009"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.823251 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebaf12e6-82fc-4885-9428-b186379c6009-kube-api-access-rnfx4" (OuterVolumeSpecName: "kube-api-access-rnfx4") pod "ebaf12e6-82fc-4885-9428-b186379c6009" (UID: "ebaf12e6-82fc-4885-9428-b186379c6009"). InnerVolumeSpecName "kube-api-access-rnfx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.853086 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebaf12e6-82fc-4885-9428-b186379c6009" (UID: "ebaf12e6-82fc-4885-9428-b186379c6009"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.879128 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-config-data" (OuterVolumeSpecName: "config-data") pod "ebaf12e6-82fc-4885-9428-b186379c6009" (UID: "ebaf12e6-82fc-4885-9428-b186379c6009"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.887843 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ebaf12e6-82fc-4885-9428-b186379c6009" (UID: "ebaf12e6-82fc-4885-9428-b186379c6009"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.894583 4725 generic.go:334] "Generic (PLEG): container finished" podID="ebaf12e6-82fc-4885-9428-b186379c6009" containerID="9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95" exitCode=0
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.894622 4725 generic.go:334] "Generic (PLEG): container finished" podID="ebaf12e6-82fc-4885-9428-b186379c6009" containerID="be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe" exitCode=143
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.894701 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.894725 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebaf12e6-82fc-4885-9428-b186379c6009","Type":"ContainerDied","Data":"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95"}
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.895207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebaf12e6-82fc-4885-9428-b186379c6009","Type":"ContainerDied","Data":"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe"}
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.895226 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebaf12e6-82fc-4885-9428-b186379c6009","Type":"ContainerDied","Data":"63420fad53cddcd5d6727f3963e87b06658902ff32a53932ca45fa8df76525ef"}
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.895255 4725 scope.go:117] "RemoveContainer" containerID="9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95"
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.907194 4725 generic.go:334] "Generic (PLEG): container finished" podID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerID="ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962" exitCode=143
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.907276 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a","Type":"ContainerDied","Data":"ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962"}
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.909990 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ebaf12e6-82fc-4885-9428-b186379c6009" (UID: "ebaf12e6-82fc-4885-9428-b186379c6009"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.919948 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.919994 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebaf12e6-82fc-4885-9428-b186379c6009-logs\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.920006 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnfx4\" (UniqueName: \"kubernetes.io/projected/ebaf12e6-82fc-4885-9428-b186379c6009-kube-api-access-rnfx4\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.920017 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.920025 4725 reconciler_common.go:293] "Volume
detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.920033 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaf12e6-82fc-4885-9428-b186379c6009-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.922817 4725 scope.go:117] "RemoveContainer" containerID="be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.954371 4725 scope.go:117] "RemoveContainer" containerID="9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95" Feb 25 11:15:19 crc kubenswrapper[4725]: E0225 11:15:19.955061 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95\": container with ID starting with 9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95 not found: ID does not exist" containerID="9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.955126 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95"} err="failed to get container status \"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95\": rpc error: code = NotFound desc = could not find container \"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95\": container with ID starting with 9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95 not found: ID does not exist" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.955361 4725 scope.go:117] "RemoveContainer" 
containerID="be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe" Feb 25 11:15:19 crc kubenswrapper[4725]: E0225 11:15:19.955751 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe\": container with ID starting with be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe not found: ID does not exist" containerID="be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.955806 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe"} err="failed to get container status \"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe\": rpc error: code = NotFound desc = could not find container \"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe\": container with ID starting with be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe not found: ID does not exist" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.955899 4725 scope.go:117] "RemoveContainer" containerID="9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.956308 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95"} err="failed to get container status \"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95\": rpc error: code = NotFound desc = could not find container \"9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95\": container with ID starting with 9b39a39a84e6c67854008bd8970a23ccf55c9868d41b86a03a838213f5c92b95 not found: ID does not exist" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.956397 4725 scope.go:117] 
"RemoveContainer" containerID="be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe" Feb 25 11:15:19 crc kubenswrapper[4725]: I0225 11:15:19.956716 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe"} err="failed to get container status \"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe\": rpc error: code = NotFound desc = could not find container \"be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe\": container with ID starting with be9189c6e1396125d9f0d1d237b696d1edfe77334108f731d09f0720c91575fe not found: ID does not exist" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.222971 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.233589 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258118 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:20 crc kubenswrapper[4725]: E0225 11:15:20.258475 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-api" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258487 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-api" Feb 25 11:15:20 crc kubenswrapper[4725]: E0225 11:15:20.258498 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-log" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258504 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-log" Feb 25 11:15:20 crc kubenswrapper[4725]: E0225 11:15:20.258526 4725 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerName="dnsmasq-dns" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258531 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerName="dnsmasq-dns" Feb 25 11:15:20 crc kubenswrapper[4725]: E0225 11:15:20.258552 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerName="init" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258558 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerName="init" Feb 25 11:15:20 crc kubenswrapper[4725]: E0225 11:15:20.258567 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3538d74b-8967-41d6-b4a4-add6bf1558ad" containerName="nova-manage" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258573 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3538d74b-8967-41d6-b4a4-add6bf1558ad" containerName="nova-manage" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258722 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3538d74b-8967-41d6-b4a4-add6bf1558ad" containerName="nova-manage" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258733 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-log" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258747 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b356c2f5-ae04-4c30-932f-b0919fa9340c" containerName="dnsmasq-dns" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.258754 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" containerName="nova-api-api" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.259663 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.265145 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.267469 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.268680 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.277379 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.430765 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-config-data\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.431167 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.431226 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e95c876-3305-4b1d-9062-dffe7e184ffd-logs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.431373 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.431451 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.431546 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vq7m\" (UniqueName: \"kubernetes.io/projected/0e95c876-3305-4b1d-9062-dffe7e184ffd-kube-api-access-7vq7m\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.533464 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-config-data\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.533514 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.533544 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e95c876-3305-4b1d-9062-dffe7e184ffd-logs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc 
kubenswrapper[4725]: I0225 11:15:20.533591 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.533625 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.533657 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vq7m\" (UniqueName: \"kubernetes.io/projected/0e95c876-3305-4b1d-9062-dffe7e184ffd-kube-api-access-7vq7m\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.534333 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e95c876-3305-4b1d-9062-dffe7e184ffd-logs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.537553 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.538276 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-config-data\") pod \"nova-api-0\" (UID: 
\"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.539783 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.541001 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e95c876-3305-4b1d-9062-dffe7e184ffd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.554338 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vq7m\" (UniqueName: \"kubernetes.io/projected/0e95c876-3305-4b1d-9062-dffe7e184ffd-kube-api-access-7vq7m\") pod \"nova-api-0\" (UID: \"0e95c876-3305-4b1d-9062-dffe7e184ffd\") " pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.606544 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.631519 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.740485 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tst22\" (UniqueName: \"kubernetes.io/projected/f6b00346-164e-4c93-8333-5c1b47ee5ea9-kube-api-access-tst22\") pod \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.740887 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-combined-ca-bundle\") pod \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.740921 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-config-data\") pod \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\" (UID: \"f6b00346-164e-4c93-8333-5c1b47ee5ea9\") " Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.745389 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b00346-164e-4c93-8333-5c1b47ee5ea9-kube-api-access-tst22" (OuterVolumeSpecName: "kube-api-access-tst22") pod "f6b00346-164e-4c93-8333-5c1b47ee5ea9" (UID: "f6b00346-164e-4c93-8333-5c1b47ee5ea9"). InnerVolumeSpecName "kube-api-access-tst22". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.776368 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6b00346-164e-4c93-8333-5c1b47ee5ea9" (UID: "f6b00346-164e-4c93-8333-5c1b47ee5ea9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.780345 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-config-data" (OuterVolumeSpecName: "config-data") pod "f6b00346-164e-4c93-8333-5c1b47ee5ea9" (UID: "f6b00346-164e-4c93-8333-5c1b47ee5ea9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.843981 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tst22\" (UniqueName: \"kubernetes.io/projected/f6b00346-164e-4c93-8333-5c1b47ee5ea9-kube-api-access-tst22\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.844014 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.844025 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b00346-164e-4c93-8333-5c1b47ee5ea9-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.917755 4725 generic.go:334] "Generic (PLEG): container finished" podID="f6b00346-164e-4c93-8333-5c1b47ee5ea9" containerID="f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f" exitCode=0 Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.917820 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.917855 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6b00346-164e-4c93-8333-5c1b47ee5ea9","Type":"ContainerDied","Data":"f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f"} Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.917978 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6b00346-164e-4c93-8333-5c1b47ee5ea9","Type":"ContainerDied","Data":"1492f6e59bb2b232ea7f64fed5420f2791172abce5aaf69ee875846e56690f59"} Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.917996 4725 scope.go:117] "RemoveContainer" containerID="f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.951933 4725 scope.go:117] "RemoveContainer" containerID="f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f" Feb 25 11:15:20 crc kubenswrapper[4725]: E0225 11:15:20.952423 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f\": container with ID starting with f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f not found: ID does not exist" containerID="f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.952466 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f"} err="failed to get container status \"f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f\": rpc error: code = NotFound desc = could not find container \"f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f\": container with ID starting with 
f83adab039425a64543e8d0463aa0e71b46175450deb373d39696690a9b5af1f not found: ID does not exist" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.953284 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.967560 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.978574 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:15:20 crc kubenswrapper[4725]: E0225 11:15:20.979150 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b00346-164e-4c93-8333-5c1b47ee5ea9" containerName="nova-scheduler-scheduler" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.979171 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b00346-164e-4c93-8333-5c1b47ee5ea9" containerName="nova-scheduler-scheduler" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.979415 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b00346-164e-4c93-8333-5c1b47ee5ea9" containerName="nova-scheduler-scheduler" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.980140 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.982514 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 11:15:20 crc kubenswrapper[4725]: I0225 11:15:20.989712 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.072819 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 11:15:21 crc kubenswrapper[4725]: W0225 11:15:21.073975 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e95c876_3305_4b1d_9062_dffe7e184ffd.slice/crio-eb08e0fc6576007e1a9d4846cf547b7b74f74b16ff0c2f8beaccd8d8afc05e38 WatchSource:0}: Error finding container eb08e0fc6576007e1a9d4846cf547b7b74f74b16ff0c2f8beaccd8d8afc05e38: Status 404 returned error can't find the container with id eb08e0fc6576007e1a9d4846cf547b7b74f74b16ff0c2f8beaccd8d8afc05e38 Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.150769 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qscps\" (UniqueName: \"kubernetes.io/projected/fe5c0a24-642c-4173-9b00-3d5a327f669e-kube-api-access-qscps\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.150963 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5c0a24-642c-4173-9b00-3d5a327f669e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.151001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5c0a24-642c-4173-9b00-3d5a327f669e-config-data\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.235058 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebaf12e6-82fc-4885-9428-b186379c6009" path="/var/lib/kubelet/pods/ebaf12e6-82fc-4885-9428-b186379c6009/volumes" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.236269 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b00346-164e-4c93-8333-5c1b47ee5ea9" path="/var/lib/kubelet/pods/f6b00346-164e-4c93-8333-5c1b47ee5ea9/volumes" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.252320 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5c0a24-642c-4173-9b00-3d5a327f669e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.252364 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5c0a24-642c-4173-9b00-3d5a327f669e-config-data\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.252417 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qscps\" (UniqueName: \"kubernetes.io/projected/fe5c0a24-642c-4173-9b00-3d5a327f669e-kube-api-access-qscps\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.256228 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fe5c0a24-642c-4173-9b00-3d5a327f669e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.265718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5c0a24-642c-4173-9b00-3d5a327f669e-config-data\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.269526 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qscps\" (UniqueName: \"kubernetes.io/projected/fe5c0a24-642c-4173-9b00-3d5a327f669e-kube-api-access-qscps\") pod \"nova-scheduler-0\" (UID: \"fe5c0a24-642c-4173-9b00-3d5a327f669e\") " pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.303579 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.730164 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 11:15:21 crc kubenswrapper[4725]: W0225 11:15:21.737763 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5c0a24_642c_4173_9b00_3d5a327f669e.slice/crio-c29d8ccf39c288a6eac3614f2e2ab6127ce7406a614690c711b3fd65edd66c6a WatchSource:0}: Error finding container c29d8ccf39c288a6eac3614f2e2ab6127ce7406a614690c711b3fd65edd66c6a: Status 404 returned error can't find the container with id c29d8ccf39c288a6eac3614f2e2ab6127ce7406a614690c711b3fd65edd66c6a Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.940452 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e95c876-3305-4b1d-9062-dffe7e184ffd","Type":"ContainerStarted","Data":"b25d6e0dd6a43e60648a3f5ca9028c1297e3a55ccb9dd5a05ea36e710f4b4b25"} Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.941039 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e95c876-3305-4b1d-9062-dffe7e184ffd","Type":"ContainerStarted","Data":"649fed4d95ce0487e0cd027cbf301a2f26214be8170fd3b323a578791c14ce65"} Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.941237 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e95c876-3305-4b1d-9062-dffe7e184ffd","Type":"ContainerStarted","Data":"eb08e0fc6576007e1a9d4846cf547b7b74f74b16ff0c2f8beaccd8d8afc05e38"} Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 11:15:21.949801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe5c0a24-642c-4173-9b00-3d5a327f669e","Type":"ContainerStarted","Data":"c29d8ccf39c288a6eac3614f2e2ab6127ce7406a614690c711b3fd65edd66c6a"} Feb 25 11:15:21 crc kubenswrapper[4725]: I0225 
11:15:21.967046 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.967019497 podStartE2EDuration="1.967019497s" podCreationTimestamp="2026-02-25 11:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:21.963803471 +0000 UTC m=+1347.462385526" watchObservedRunningTime="2026-02-25 11:15:21.967019497 +0000 UTC m=+1347.465601532" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.316239 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 10.217.0.2:41158->10.217.0.201:8775: read: connection reset by peer" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.316235 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 10.217.0.2:41162->10.217.0.201:8775: read: connection reset by peer" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.822927 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.965673 4725 generic.go:334] "Generic (PLEG): container finished" podID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerID="4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0" exitCode=0 Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.965739 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a","Type":"ContainerDied","Data":"4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0"} Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.965767 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.965784 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a","Type":"ContainerDied","Data":"6aa5307de5eb33cec5c5b501c21ead13319b642d654e0936a0349776eaab9390"} Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.965805 4725 scope.go:117] "RemoveContainer" containerID="4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.967623 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe5c0a24-642c-4173-9b00-3d5a327f669e","Type":"ContainerStarted","Data":"44eb28fdede74d7067c731f4317d8756c7c0c377f4713855cf3a42d9128be157"} Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.990595 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b964l\" (UniqueName: \"kubernetes.io/projected/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-kube-api-access-b964l\") pod \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 
11:15:22.990720 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-logs\") pod \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.990856 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-combined-ca-bundle\") pod \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.991046 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-config-data\") pod \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.991102 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-nova-metadata-tls-certs\") pod \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\" (UID: \"7facde5c-b0f0-4cbd-994c-15eb5a9ac57a\") " Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.994419 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-logs" (OuterVolumeSpecName: "logs") pod "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" (UID: "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.995541 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9955219189999998 podStartE2EDuration="2.995521919s" podCreationTimestamp="2026-02-25 11:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:22.992229961 +0000 UTC m=+1348.490811996" watchObservedRunningTime="2026-02-25 11:15:22.995521919 +0000 UTC m=+1348.494103944" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.997916 4725 scope.go:117] "RemoveContainer" containerID="ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962" Feb 25 11:15:22 crc kubenswrapper[4725]: I0225 11:15:22.998599 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-kube-api-access-b964l" (OuterVolumeSpecName: "kube-api-access-b964l") pod "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" (UID: "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a"). InnerVolumeSpecName "kube-api-access-b964l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.019807 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" (UID: "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.032793 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-config-data" (OuterVolumeSpecName: "config-data") pod "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" (UID: "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.060966 4725 scope.go:117] "RemoveContainer" containerID="4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0" Feb 25 11:15:23 crc kubenswrapper[4725]: E0225 11:15:23.061546 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0\": container with ID starting with 4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0 not found: ID does not exist" containerID="4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.061575 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0"} err="failed to get container status \"4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0\": rpc error: code = NotFound desc = could not find container \"4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0\": container with ID starting with 4f275b37b591811e75123b2abc58f0f1b123105067e9d725d8dd30b766687bc0 not found: ID does not exist" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.061594 4725 scope.go:117] "RemoveContainer" containerID="ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962" Feb 25 11:15:23 crc kubenswrapper[4725]: E0225 11:15:23.061900 4725 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962\": container with ID starting with ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962 not found: ID does not exist" containerID="ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.061921 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962"} err="failed to get container status \"ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962\": rpc error: code = NotFound desc = could not find container \"ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962\": container with ID starting with ad4d45ace18982ff045e944b92ce054c42592b60038d0fca4fb7f2fdefa55962 not found: ID does not exist" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.066798 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" (UID: "7facde5c-b0f0-4cbd-994c-15eb5a9ac57a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.097183 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.097216 4725 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.097230 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b964l\" (UniqueName: \"kubernetes.io/projected/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-kube-api-access-b964l\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.097241 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-logs\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.097252 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.330602 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.356405 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.372808 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:15:23 crc kubenswrapper[4725]: E0225 11:15:23.373736 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-log" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.373915 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-log" Feb 25 11:15:23 crc kubenswrapper[4725]: E0225 11:15:23.374088 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-metadata" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.374260 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-metadata" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.374785 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-log" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.374965 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" containerName="nova-metadata-metadata" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.376778 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.379490 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.379716 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.397503 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.506878 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5nkn\" (UniqueName: \"kubernetes.io/projected/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-kube-api-access-b5nkn\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.507209 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.507278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-config-data\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.507310 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-logs\") pod \"nova-metadata-0\" 
(UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.507342 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.609329 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.609425 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5nkn\" (UniqueName: \"kubernetes.io/projected/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-kube-api-access-b5nkn\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.609449 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.609510 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-config-data\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 
11:15:23.609540 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-logs\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.609900 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-logs\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.613966 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-config-data\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.614270 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.614358 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.629594 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5nkn\" (UniqueName: \"kubernetes.io/projected/670a8e0c-fb4b-4311-b236-41a3f10c1ad2-kube-api-access-b5nkn\") pod \"nova-metadata-0\" (UID: 
\"670a8e0c-fb4b-4311-b236-41a3f10c1ad2\") " pod="openstack/nova-metadata-0" Feb 25 11:15:23 crc kubenswrapper[4725]: I0225 11:15:23.695129 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 11:15:24 crc kubenswrapper[4725]: I0225 11:15:24.159736 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 11:15:24 crc kubenswrapper[4725]: W0225 11:15:24.159899 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod670a8e0c_fb4b_4311_b236_41a3f10c1ad2.slice/crio-0bd852138810ad7a03fedb0730f3057ff76033c6daef7bbe75d884c46af84584 WatchSource:0}: Error finding container 0bd852138810ad7a03fedb0730f3057ff76033c6daef7bbe75d884c46af84584: Status 404 returned error can't find the container with id 0bd852138810ad7a03fedb0730f3057ff76033c6daef7bbe75d884c46af84584 Feb 25 11:15:24 crc kubenswrapper[4725]: I0225 11:15:24.997115 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"670a8e0c-fb4b-4311-b236-41a3f10c1ad2","Type":"ContainerStarted","Data":"7391e798163044d04e909af1cf2d7bfd4a0b3879d039316e6061401ea76f1fda"} Feb 25 11:15:24 crc kubenswrapper[4725]: I0225 11:15:24.997431 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"670a8e0c-fb4b-4311-b236-41a3f10c1ad2","Type":"ContainerStarted","Data":"f915f080a5532fc34fb040e56d1daeb09bc5f8cc276301fc3724006043237fd8"} Feb 25 11:15:24 crc kubenswrapper[4725]: I0225 11:15:24.997444 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"670a8e0c-fb4b-4311-b236-41a3f10c1ad2","Type":"ContainerStarted","Data":"0bd852138810ad7a03fedb0730f3057ff76033c6daef7bbe75d884c46af84584"} Feb 25 11:15:25 crc kubenswrapper[4725]: I0225 11:15:25.067950 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.067925836 podStartE2EDuration="2.067925836s" podCreationTimestamp="2026-02-25 11:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:15:25.060059675 +0000 UTC m=+1350.558641730" watchObservedRunningTime="2026-02-25 11:15:25.067925836 +0000 UTC m=+1350.566507871" Feb 25 11:15:25 crc kubenswrapper[4725]: I0225 11:15:25.244682 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7facde5c-b0f0-4cbd-994c-15eb5a9ac57a" path="/var/lib/kubelet/pods/7facde5c-b0f0-4cbd-994c-15eb5a9ac57a/volumes" Feb 25 11:15:26 crc kubenswrapper[4725]: I0225 11:15:26.304318 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 11:15:28 crc kubenswrapper[4725]: I0225 11:15:28.696026 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:15:28 crc kubenswrapper[4725]: I0225 11:15:28.696429 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 11:15:30 crc kubenswrapper[4725]: I0225 11:15:30.608208 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 11:15:30 crc kubenswrapper[4725]: I0225 11:15:30.608636 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 11:15:31 crc kubenswrapper[4725]: I0225 11:15:31.304282 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 25 11:15:31 crc kubenswrapper[4725]: I0225 11:15:31.352655 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 25 11:15:31 crc kubenswrapper[4725]: I0225 11:15:31.628109 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="0e95c876-3305-4b1d-9062-dffe7e184ffd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 11:15:31 crc kubenswrapper[4725]: I0225 11:15:31.628121 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e95c876-3305-4b1d-9062-dffe7e184ffd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 11:15:32 crc kubenswrapper[4725]: I0225 11:15:32.115774 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 25 11:15:33 crc kubenswrapper[4725]: I0225 11:15:33.695335 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 11:15:33 crc kubenswrapper[4725]: I0225 11:15:33.695856 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 11:15:34 crc kubenswrapper[4725]: I0225 11:15:34.778016 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="670a8e0c-fb4b-4311-b236-41a3f10c1ad2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 11:15:34 crc kubenswrapper[4725]: I0225 11:15:34.778037 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="670a8e0c-fb4b-4311-b236-41a3f10c1ad2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 11:15:40 crc kubenswrapper[4725]: I0225 11:15:40.617564 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Feb 25 11:15:40 crc kubenswrapper[4725]: I0225 11:15:40.618517 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 11:15:40 crc kubenswrapper[4725]: I0225 11:15:40.621503 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 11:15:40 crc kubenswrapper[4725]: I0225 11:15:40.633650 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 11:15:41 crc kubenswrapper[4725]: I0225 11:15:41.194227 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 11:15:41 crc kubenswrapper[4725]: I0225 11:15:41.209135 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 11:15:41 crc kubenswrapper[4725]: I0225 11:15:41.555435 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:15:41 crc kubenswrapper[4725]: I0225 11:15:41.555774 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:15:41 crc kubenswrapper[4725]: I0225 11:15:41.650393 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 25 11:15:43 crc kubenswrapper[4725]: I0225 11:15:43.699400 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 11:15:43 crc kubenswrapper[4725]: I0225 11:15:43.707851 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 11:15:43 crc kubenswrapper[4725]: I0225 11:15:43.745199 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 11:15:44 crc kubenswrapper[4725]: I0225 11:15:44.230702 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 11:15:51 crc kubenswrapper[4725]: I0225 11:15:51.912284 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:15:53 crc kubenswrapper[4725]: I0225 11:15:53.525618 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:15:56 crc kubenswrapper[4725]: I0225 11:15:56.187611 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerName="rabbitmq" containerID="cri-o://68acb62c236cce60fe0e6b8ce02f29b116f03427200c51be4c7cdd38ee606404" gracePeriod=604796 Feb 25 11:15:58 crc kubenswrapper[4725]: I0225 11:15:58.139970 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="rabbitmq" containerID="cri-o://ba1270fb11896d23a9d4c55ad713140436475dae01dfee53fe8721ec435833ea" gracePeriod=604796 Feb 25 11:15:58 crc kubenswrapper[4725]: I0225 11:15:58.977689 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Feb 25 11:15:59 crc kubenswrapper[4725]: I0225 11:15:59.250087 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.169752 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533636-7qvvt"] Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.171658 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.180983 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.181011 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.181331 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.190495 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533636-7qvvt"] Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.284540 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjldn\" (UniqueName: \"kubernetes.io/projected/63a6913c-322e-4be4-acd7-29a649757554-kube-api-access-sjldn\") pod \"auto-csr-approver-29533636-7qvvt\" (UID: \"63a6913c-322e-4be4-acd7-29a649757554\") " pod="openshift-infra/auto-csr-approver-29533636-7qvvt" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.386809 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjldn\" (UniqueName: \"kubernetes.io/projected/63a6913c-322e-4be4-acd7-29a649757554-kube-api-access-sjldn\") pod \"auto-csr-approver-29533636-7qvvt\" (UID: \"63a6913c-322e-4be4-acd7-29a649757554\") " 
pod="openshift-infra/auto-csr-approver-29533636-7qvvt" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.429775 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjldn\" (UniqueName: \"kubernetes.io/projected/63a6913c-322e-4be4-acd7-29a649757554-kube-api-access-sjldn\") pod \"auto-csr-approver-29533636-7qvvt\" (UID: \"63a6913c-322e-4be4-acd7-29a649757554\") " pod="openshift-infra/auto-csr-approver-29533636-7qvvt" Feb 25 11:16:00 crc kubenswrapper[4725]: I0225 11:16:00.502442 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" Feb 25 11:16:01 crc kubenswrapper[4725]: I0225 11:16:01.032455 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533636-7qvvt"] Feb 25 11:16:01 crc kubenswrapper[4725]: I0225 11:16:01.411687 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" event={"ID":"63a6913c-322e-4be4-acd7-29a649757554","Type":"ContainerStarted","Data":"5f3f7e558444f250d1b371fcd1a3611779e0c0daa03984bb874da4ace1f1bffa"} Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.425096 4725 generic.go:334] "Generic (PLEG): container finished" podID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerID="68acb62c236cce60fe0e6b8ce02f29b116f03427200c51be4c7cdd38ee606404" exitCode=0 Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.425163 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57e7a103-f119-4d8e-bb7f-96f36b66994e","Type":"ContainerDied","Data":"68acb62c236cce60fe0e6b8ce02f29b116f03427200c51be4c7cdd38ee606404"} Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.427470 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" 
event={"ID":"63a6913c-322e-4be4-acd7-29a649757554","Type":"ContainerStarted","Data":"3cdaf2838439ac380a611605ecaa3171e841cf7a49575bfd2d7230d9cc03c5d6"} Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.458303 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" podStartSLOduration=1.596380825 podStartE2EDuration="2.458276946s" podCreationTimestamp="2026-02-25 11:16:00 +0000 UTC" firstStartedPulling="2026-02-25 11:16:01.026775887 +0000 UTC m=+1386.525357912" lastFinishedPulling="2026-02-25 11:16:01.888672008 +0000 UTC m=+1387.387254033" observedRunningTime="2026-02-25 11:16:02.446295765 +0000 UTC m=+1387.944877810" watchObservedRunningTime="2026-02-25 11:16:02.458276946 +0000 UTC m=+1387.956858971" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.862296 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.936672 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-config-data\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.936764 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-server-conf\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.936873 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 
11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937089 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-tls\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937147 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57e7a103-f119-4d8e-bb7f-96f36b66994e-erlang-cookie-secret\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937210 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57e7a103-f119-4d8e-bb7f-96f36b66994e-pod-info\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937253 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-plugins\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937293 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-confd\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937331 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-plugins-conf\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937387 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv6dw\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-kube-api-access-tv6dw\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.937435 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-erlang-cookie\") pod \"57e7a103-f119-4d8e-bb7f-96f36b66994e\" (UID: \"57e7a103-f119-4d8e-bb7f-96f36b66994e\") " Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.938900 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.939358 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.939463 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.947694 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/57e7a103-f119-4d8e-bb7f-96f36b66994e-pod-info" (OuterVolumeSpecName: "pod-info") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.949135 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.950279 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.971800 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-kube-api-access-tv6dw" (OuterVolumeSpecName: "kube-api-access-tv6dw") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "kube-api-access-tv6dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.981465 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e7a103-f119-4d8e-bb7f-96f36b66994e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:16:02 crc kubenswrapper[4725]: I0225 11:16:02.989230 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-config-data" (OuterVolumeSpecName: "config-data") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.024373 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-server-conf" (OuterVolumeSpecName: "server-conf") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041243 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041275 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv6dw\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-kube-api-access-tv6dw\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041285 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041296 4725 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-server-conf\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041318 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041328 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041336 4725 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57e7a103-f119-4d8e-bb7f-96f36b66994e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 
11:16:03.041344 4725 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57e7a103-f119-4d8e-bb7f-96f36b66994e-pod-info\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041352 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.041361 4725 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57e7a103-f119-4d8e-bb7f-96f36b66994e-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.064196 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.094075 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "57e7a103-f119-4d8e-bb7f-96f36b66994e" (UID: "57e7a103-f119-4d8e-bb7f-96f36b66994e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.142896 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57e7a103-f119-4d8e-bb7f-96f36b66994e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.142924 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.435946 4725 generic.go:334] "Generic (PLEG): container finished" podID="63a6913c-322e-4be4-acd7-29a649757554" containerID="3cdaf2838439ac380a611605ecaa3171e841cf7a49575bfd2d7230d9cc03c5d6" exitCode=0 Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.436009 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" event={"ID":"63a6913c-322e-4be4-acd7-29a649757554","Type":"ContainerDied","Data":"3cdaf2838439ac380a611605ecaa3171e841cf7a49575bfd2d7230d9cc03c5d6"} Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.442039 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57e7a103-f119-4d8e-bb7f-96f36b66994e","Type":"ContainerDied","Data":"b3ce4e075f97844980fb715be3c14b371f81502dcee373e2ec83f9b4b7e04b07"} Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.442086 4725 scope.go:117] "RemoveContainer" containerID="68acb62c236cce60fe0e6b8ce02f29b116f03427200c51be4c7cdd38ee606404" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.442213 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.475687 4725 scope.go:117] "RemoveContainer" containerID="65bb35575781bad2e98c04d4e1b97efb65e9db76bd69365abd39ee6385396cf2" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.475913 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.484172 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.500167 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:16:03 crc kubenswrapper[4725]: E0225 11:16:03.500520 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerName="setup-container" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.500539 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerName="setup-container" Feb 25 11:16:03 crc kubenswrapper[4725]: E0225 11:16:03.500558 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerName="rabbitmq" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.500563 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerName="rabbitmq" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.500740 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" containerName="rabbitmq" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.501591 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.505397 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.506727 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.507234 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.507466 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mmfh7" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.507558 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.507644 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.507715 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.515816 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.659866 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.659940 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.659977 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660009 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8cd71ea0-569c-4093-931d-2e0c841bcbf4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660037 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660066 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660092 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj5l5\" (UniqueName: 
\"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-kube-api-access-kj5l5\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660113 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-config-data\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660143 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660201 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8cd71ea0-569c-4093-931d-2e0c841bcbf4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.660227 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762018 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762163 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8cd71ea0-569c-4093-931d-2e0c841bcbf4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762201 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762266 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762381 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762429 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8cd71ea0-569c-4093-931d-2e0c841bcbf4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762473 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762512 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762564 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj5l5\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-kube-api-access-kj5l5\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.762598 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-config-data\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.763843 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.765128 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.765353 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.766382 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.767294 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.767479 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cd71ea0-569c-4093-931d-2e0c841bcbf4-config-data\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 
11:16:03.768368 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.770138 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.771447 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8cd71ea0-569c-4093-931d-2e0c841bcbf4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.771539 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8cd71ea0-569c-4093-931d-2e0c841bcbf4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.799587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj5l5\" (UniqueName: \"kubernetes.io/projected/8cd71ea0-569c-4093-931d-2e0c841bcbf4-kube-api-access-kj5l5\") pod \"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.807258 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"rabbitmq-server-0\" (UID: \"8cd71ea0-569c-4093-931d-2e0c841bcbf4\") " pod="openstack/rabbitmq-server-0" Feb 25 11:16:03 crc kubenswrapper[4725]: I0225 11:16:03.844620 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.375433 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.454129 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8cd71ea0-569c-4093-931d-2e0c841bcbf4","Type":"ContainerStarted","Data":"53023cca1d5f783e7fd4d5d3bb0736c3005158e8302298c233fd365e3f835fed"} Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.461602 4725 generic.go:334] "Generic (PLEG): container finished" podID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerID="ba1270fb11896d23a9d4c55ad713140436475dae01dfee53fe8721ec435833ea" exitCode=0 Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.461671 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d1a511fd-4696-456a-8263-da4cd2f5eff1","Type":"ContainerDied","Data":"ba1270fb11896d23a9d4c55ad713140436475dae01dfee53fe8721ec435833ea"} Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.673441 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785093 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-erlang-cookie\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785159 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-confd\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785183 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-server-conf\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785228 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d1a511fd-4696-456a-8263-da4cd2f5eff1-erlang-cookie-secret\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785266 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d1a511fd-4696-456a-8263-da4cd2f5eff1-pod-info\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785302 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-config-data\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785330 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-plugins-conf\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785371 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785402 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-tls\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785491 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-plugins\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.785546 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq254\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-kube-api-access-jq254\") pod \"d1a511fd-4696-456a-8263-da4cd2f5eff1\" (UID: \"d1a511fd-4696-456a-8263-da4cd2f5eff1\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 
11:16:04.786209 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.786475 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.786518 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.789985 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a511fd-4696-456a-8263-da4cd2f5eff1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.790441 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-kube-api-access-jq254" (OuterVolumeSpecName: "kube-api-access-jq254") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "kube-api-access-jq254". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.790503 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d1a511fd-4696-456a-8263-da4cd2f5eff1-pod-info" (OuterVolumeSpecName: "pod-info") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.790579 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.792440 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.833039 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-config-data" (OuterVolumeSpecName: "config-data") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.848492 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-server-conf" (OuterVolumeSpecName: "server-conf") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887507 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887535 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq254\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-kube-api-access-jq254\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887546 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887561 4725 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-server-conf\") on node 
\"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887571 4725 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d1a511fd-4696-456a-8263-da4cd2f5eff1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887579 4725 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d1a511fd-4696-456a-8263-da4cd2f5eff1-pod-info\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887587 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887595 4725 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d1a511fd-4696-456a-8263-da4cd2f5eff1-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887624 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.887634 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.901182 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d1a511fd-4696-456a-8263-da4cd2f5eff1" (UID: "d1a511fd-4696-456a-8263-da4cd2f5eff1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.910809 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.989223 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:04.989262 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d1a511fd-4696-456a-8263-da4cd2f5eff1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.239400 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e7a103-f119-4d8e-bb7f-96f36b66994e" path="/var/lib/kubelet/pods/57e7a103-f119-4d8e-bb7f-96f36b66994e/volumes" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.430852 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-spjhq"] Feb 25 11:16:05 crc kubenswrapper[4725]: E0225 11:16:05.431407 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="setup-container" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.431421 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="setup-container" Feb 25 11:16:05 crc kubenswrapper[4725]: E0225 11:16:05.431450 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="rabbitmq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.431457 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="rabbitmq" Feb 25 
11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.431627 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" containerName="rabbitmq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.432478 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.435295 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.449955 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-spjhq"] Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.481902 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d1a511fd-4696-456a-8263-da4cd2f5eff1","Type":"ContainerDied","Data":"d4202ac9b5fd5d7a9e4d48adcaad80437647eb8924a79f2d618a7d282debb486"} Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.481958 4725 scope.go:117] "RemoveContainer" containerID="ba1270fb11896d23a9d4c55ad713140436475dae01dfee53fe8721ec435833ea" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.482112 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.489240 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.495492 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" event={"ID":"63a6913c-322e-4be4-acd7-29a649757554","Type":"ContainerDied","Data":"5f3f7e558444f250d1b371fcd1a3611779e0c0daa03984bb874da4ace1f1bffa"} Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.495534 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3f7e558444f250d1b371fcd1a3611779e0c0daa03984bb874da4ace1f1bffa" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.498864 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.498904 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.498927 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-config\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.498988 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.499007 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.499024 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.499084 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxdc\" (UniqueName: \"kubernetes.io/projected/c9a23d62-04c9-41c8-a213-fb9d604a1494-kube-api-access-mtxdc\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.518023 4725 scope.go:117] "RemoveContainer" containerID="6d69b6d7376a54b89e12188a0e9f6681be6c795c0ea23114c746f56b5175501a" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.569941 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.578123 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 
11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.599904 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:16:05 crc kubenswrapper[4725]: E0225 11:16:05.600335 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a6913c-322e-4be4-acd7-29a649757554" containerName="oc" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.600355 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a6913c-322e-4be4-acd7-29a649757554" containerName="oc" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.600570 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a6913c-322e-4be4-acd7-29a649757554" containerName="oc" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.602992 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjldn\" (UniqueName: \"kubernetes.io/projected/63a6913c-322e-4be4-acd7-29a649757554-kube-api-access-sjldn\") pod \"63a6913c-322e-4be4-acd7-29a649757554\" (UID: \"63a6913c-322e-4be4-acd7-29a649757554\") " Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.603336 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.603386 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.603413 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-config\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.603488 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.603509 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.603530 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.603623 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxdc\" (UniqueName: \"kubernetes.io/projected/c9a23d62-04c9-41c8-a213-fb9d604a1494-kube-api-access-mtxdc\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.604624 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.605007 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.605210 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.609138 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a6913c-322e-4be4-acd7-29a649757554-kube-api-access-sjldn" (OuterVolumeSpecName: "kube-api-access-sjldn") pod "63a6913c-322e-4be4-acd7-29a649757554" (UID: "63a6913c-322e-4be4-acd7-29a649757554"). InnerVolumeSpecName "kube-api-access-sjldn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.609474 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-config\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.610129 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.611024 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.611431 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.611505 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gw6sm" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.614221 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.614308 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.614420 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.614559 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.614606 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.615354 4725 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.622195 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxdc\" (UniqueName: \"kubernetes.io/projected/c9a23d62-04c9-41c8-a213-fb9d604a1494-kube-api-access-mtxdc\") pod \"dnsmasq-dns-79bd4cc8c9-spjhq\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.633133 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.704638 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.704686 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bb7295b-193b-45b6-8913-8508d190e664-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.704712 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfnz\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-kube-api-access-sbfnz\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705055 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705118 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705191 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705249 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bb7295b-193b-45b6-8913-8508d190e664-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705316 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705362 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705591 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.705760 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjldn\" (UniqueName: \"kubernetes.io/projected/63a6913c-322e-4be4-acd7-29a649757554-kube-api-access-sjldn\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.800269 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807067 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807099 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bb7295b-193b-45b6-8913-8508d190e664-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807122 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfnz\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-kube-api-access-sbfnz\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807183 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807204 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: 
I0225 11:16:05.807228 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807252 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bb7295b-193b-45b6-8913-8508d190e664-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807268 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807287 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807307 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807343 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.807953 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.808513 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.809199 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.809404 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.809564 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.809731 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bb7295b-193b-45b6-8913-8508d190e664-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.812548 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.815395 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bb7295b-193b-45b6-8913-8508d190e664-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.815509 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bb7295b-193b-45b6-8913-8508d190e664-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.816644 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.829880 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfnz\" (UniqueName: \"kubernetes.io/projected/5bb7295b-193b-45b6-8913-8508d190e664-kube-api-access-sbfnz\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:05 crc kubenswrapper[4725]: I0225 11:16:05.856105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bb7295b-193b-45b6-8913-8508d190e664\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.148349 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.333808 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-spjhq"] Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.527503 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8cd71ea0-569c-4093-931d-2e0c841bcbf4","Type":"ContainerStarted","Data":"1d547e1172dccb16f832d91fa7152957fecb8ccf693948d90560da45fbb9a595"} Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.539683 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533636-7qvvt" Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.539679 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" event={"ID":"c9a23d62-04c9-41c8-a213-fb9d604a1494","Type":"ContainerStarted","Data":"e4d8fe75e1694b304eb8297e605530abfd274949c9327b7279064d0eba19fe72"} Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.571781 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533630-v7bl7"] Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.579727 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533630-v7bl7"] Feb 25 11:16:06 crc kubenswrapper[4725]: I0225 11:16:06.661395 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 11:16:06 crc kubenswrapper[4725]: W0225 11:16:06.679460 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bb7295b_193b_45b6_8913_8508d190e664.slice/crio-c3297d2dd80fa9fe6c745980cc4eb23f3d46374b13357b0d9f7fa4de1bf073da WatchSource:0}: Error finding container c3297d2dd80fa9fe6c745980cc4eb23f3d46374b13357b0d9f7fa4de1bf073da: Status 404 returned error can't find the container with id c3297d2dd80fa9fe6c745980cc4eb23f3d46374b13357b0d9f7fa4de1bf073da Feb 25 11:16:07 crc kubenswrapper[4725]: I0225 11:16:07.241735 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e6596a-9d15-422f-8436-5c3ea71de9a6" path="/var/lib/kubelet/pods/22e6596a-9d15-422f-8436-5c3ea71de9a6/volumes" Feb 25 11:16:07 crc kubenswrapper[4725]: I0225 11:16:07.244129 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a511fd-4696-456a-8263-da4cd2f5eff1" path="/var/lib/kubelet/pods/d1a511fd-4696-456a-8263-da4cd2f5eff1/volumes" Feb 25 11:16:07 crc kubenswrapper[4725]: I0225 
11:16:07.556001 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bb7295b-193b-45b6-8913-8508d190e664","Type":"ContainerStarted","Data":"c3297d2dd80fa9fe6c745980cc4eb23f3d46374b13357b0d9f7fa4de1bf073da"} Feb 25 11:16:07 crc kubenswrapper[4725]: I0225 11:16:07.560467 4725 generic.go:334] "Generic (PLEG): container finished" podID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerID="245ee6443a8400e8f863a2cc43af10fb5c6762d079402996b9d82af920adc14a" exitCode=0 Feb 25 11:16:07 crc kubenswrapper[4725]: I0225 11:16:07.560538 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" event={"ID":"c9a23d62-04c9-41c8-a213-fb9d604a1494","Type":"ContainerDied","Data":"245ee6443a8400e8f863a2cc43af10fb5c6762d079402996b9d82af920adc14a"} Feb 25 11:16:08 crc kubenswrapper[4725]: I0225 11:16:08.574743 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bb7295b-193b-45b6-8913-8508d190e664","Type":"ContainerStarted","Data":"3abf807e6a486c82987753a9b5bfbb667af7a27bfc9a71036dc5f79655f1b691"} Feb 25 11:16:08 crc kubenswrapper[4725]: I0225 11:16:08.577504 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" event={"ID":"c9a23d62-04c9-41c8-a213-fb9d604a1494","Type":"ContainerStarted","Data":"38492d23d6477b8aee035783cffae5cafe54f4f057d05e30d0f77bcaf262bcf9"} Feb 25 11:16:08 crc kubenswrapper[4725]: I0225 11:16:08.577730 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:08 crc kubenswrapper[4725]: I0225 11:16:08.643000 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" podStartSLOduration=3.642970198 podStartE2EDuration="3.642970198s" podCreationTimestamp="2026-02-25 11:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:16:08.6344666 +0000 UTC m=+1394.133048625" watchObservedRunningTime="2026-02-25 11:16:08.642970198 +0000 UTC m=+1394.141552263" Feb 25 11:16:11 crc kubenswrapper[4725]: I0225 11:16:11.556445 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:16:11 crc kubenswrapper[4725]: I0225 11:16:11.557011 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:16:15 crc kubenswrapper[4725]: I0225 11:16:15.803449 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" Feb 25 11:16:15 crc kubenswrapper[4725]: I0225 11:16:15.888258 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xmk5s"] Feb 25 11:16:15 crc kubenswrapper[4725]: I0225 11:16:15.888520 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" podUID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerName="dnsmasq-dns" containerID="cri-o://b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd" gracePeriod=10 Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.063169 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-hrfcv"] Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.067396 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.090562 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-hrfcv"] Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.113186 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47xh\" (UniqueName: \"kubernetes.io/projected/f0789964-49e9-49e9-a6f5-133761c0d9f8-kube-api-access-m47xh\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.113652 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.113694 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.113769 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.113855 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.113923 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-dns-svc\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.113985 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-config\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.217947 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.220821 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.222001 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-dns-svc\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.218011 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-dns-svc\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.226256 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-config\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.226624 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47xh\" (UniqueName: \"kubernetes.io/projected/f0789964-49e9-49e9-a6f5-133761c0d9f8-kube-api-access-m47xh\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.226720 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.226760 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.226789 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.236086 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-config\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.236668 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.237348 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.237931 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0789964-49e9-49e9-a6f5-133761c0d9f8-ovsdbserver-nb\") pod 
\"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.266979 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m47xh\" (UniqueName: \"kubernetes.io/projected/f0789964-49e9-49e9-a6f5-133761c0d9f8-kube-api-access-m47xh\") pod \"dnsmasq-dns-55478c4467-hrfcv\" (UID: \"f0789964-49e9-49e9-a6f5-133761c0d9f8\") " pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.400346 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-hrfcv" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.435155 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.532067 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-swift-storage-0\") pod \"ed310acc-141b-4704-85b7-cc6761c13c0a\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.532445 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-config\") pod \"ed310acc-141b-4704-85b7-cc6761c13c0a\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.532489 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-sb\") pod \"ed310acc-141b-4704-85b7-cc6761c13c0a\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 
11:16:16.532647 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2h7b\" (UniqueName: \"kubernetes.io/projected/ed310acc-141b-4704-85b7-cc6761c13c0a-kube-api-access-q2h7b\") pod \"ed310acc-141b-4704-85b7-cc6761c13c0a\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.532687 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-nb\") pod \"ed310acc-141b-4704-85b7-cc6761c13c0a\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.532744 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-svc\") pod \"ed310acc-141b-4704-85b7-cc6761c13c0a\" (UID: \"ed310acc-141b-4704-85b7-cc6761c13c0a\") " Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.542588 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed310acc-141b-4704-85b7-cc6761c13c0a-kube-api-access-q2h7b" (OuterVolumeSpecName: "kube-api-access-q2h7b") pod "ed310acc-141b-4704-85b7-cc6761c13c0a" (UID: "ed310acc-141b-4704-85b7-cc6761c13c0a"). InnerVolumeSpecName "kube-api-access-q2h7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.583546 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed310acc-141b-4704-85b7-cc6761c13c0a" (UID: "ed310acc-141b-4704-85b7-cc6761c13c0a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.593450 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-config" (OuterVolumeSpecName: "config") pod "ed310acc-141b-4704-85b7-cc6761c13c0a" (UID: "ed310acc-141b-4704-85b7-cc6761c13c0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.598423 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed310acc-141b-4704-85b7-cc6761c13c0a" (UID: "ed310acc-141b-4704-85b7-cc6761c13c0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.616814 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed310acc-141b-4704-85b7-cc6761c13c0a" (UID: "ed310acc-141b-4704-85b7-cc6761c13c0a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.620250 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed310acc-141b-4704-85b7-cc6761c13c0a" (UID: "ed310acc-141b-4704-85b7-cc6761c13c0a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.635305 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2h7b\" (UniqueName: \"kubernetes.io/projected/ed310acc-141b-4704-85b7-cc6761c13c0a-kube-api-access-q2h7b\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.635734 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.635888 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.635999 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.636104 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.636207 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed310acc-141b-4704-85b7-cc6761c13c0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.664803 4725 generic.go:334] "Generic (PLEG): container finished" podID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerID="b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd" exitCode=0 Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.664863 4725 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.664852 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" event={"ID":"ed310acc-141b-4704-85b7-cc6761c13c0a","Type":"ContainerDied","Data":"b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd"} Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.665053 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xmk5s" event={"ID":"ed310acc-141b-4704-85b7-cc6761c13c0a","Type":"ContainerDied","Data":"185f245afcb9e06bc93745de0f7704bdbe1bba118a4691f9ae8622ef54a6a87c"} Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.665124 4725 scope.go:117] "RemoveContainer" containerID="b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.685239 4725 scope.go:117] "RemoveContainer" containerID="c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72" Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.708069 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xmk5s"] Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.716766 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xmk5s"] Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.727456 4725 scope.go:117] "RemoveContainer" containerID="b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd" Feb 25 11:16:16 crc kubenswrapper[4725]: E0225 11:16:16.728981 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd\": container with ID starting with b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd not found: ID does not exist" 
containerID="b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd"
Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.729024 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd"} err="failed to get container status \"b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd\": rpc error: code = NotFound desc = could not find container \"b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd\": container with ID starting with b864435c705b9530f51aa566c8998ce9f659b734fb10fd1735dc60c1f57ec6fd not found: ID does not exist"
Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.729052 4725 scope.go:117] "RemoveContainer" containerID="c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72"
Feb 25 11:16:16 crc kubenswrapper[4725]: E0225 11:16:16.729471 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72\": container with ID starting with c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72 not found: ID does not exist" containerID="c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72"
Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.729512 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72"} err="failed to get container status \"c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72\": rpc error: code = NotFound desc = could not find container \"c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72\": container with ID starting with c0f765dd2c294aed50180717b24cf11c1d4f162ebb61424779cca91c10235f72 not found: ID does not exist"
Feb 25 11:16:16 crc kubenswrapper[4725]: E0225 11:16:16.783263 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded310acc_141b_4704_85b7_cc6761c13c0a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded310acc_141b_4704_85b7_cc6761c13c0a.slice/crio-185f245afcb9e06bc93745de0f7704bdbe1bba118a4691f9ae8622ef54a6a87c\": RecentStats: unable to find data in memory cache]"
Feb 25 11:16:16 crc kubenswrapper[4725]: I0225 11:16:16.879376 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-hrfcv"]
Feb 25 11:16:17 crc kubenswrapper[4725]: I0225 11:16:17.065943 4725 scope.go:117] "RemoveContainer" containerID="1d69f749f1434c1c7237a4c7672735e636b9586bade94b08610ec1bbebc6cc47"
Feb 25 11:16:17 crc kubenswrapper[4725]: I0225 11:16:17.235771 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed310acc-141b-4704-85b7-cc6761c13c0a" path="/var/lib/kubelet/pods/ed310acc-141b-4704-85b7-cc6761c13c0a/volumes"
Feb 25 11:16:17 crc kubenswrapper[4725]: I0225 11:16:17.678134 4725 generic.go:334] "Generic (PLEG): container finished" podID="f0789964-49e9-49e9-a6f5-133761c0d9f8" containerID="2be830ede5356e6a0aa5d5749e59fc898b36a7e22b00102b09a43d14ff614009" exitCode=0
Feb 25 11:16:17 crc kubenswrapper[4725]: I0225 11:16:17.678186 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-hrfcv" event={"ID":"f0789964-49e9-49e9-a6f5-133761c0d9f8","Type":"ContainerDied","Data":"2be830ede5356e6a0aa5d5749e59fc898b36a7e22b00102b09a43d14ff614009"}
Feb 25 11:16:17 crc kubenswrapper[4725]: I0225 11:16:17.678215 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-hrfcv" event={"ID":"f0789964-49e9-49e9-a6f5-133761c0d9f8","Type":"ContainerStarted","Data":"f764757fe8fd9943e0cb9e551a5d2ff994afec5643846c208d4a4af472ec2441"}
Feb 25 11:16:18 crc kubenswrapper[4725]: I0225 11:16:18.701228 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-hrfcv" event={"ID":"f0789964-49e9-49e9-a6f5-133761c0d9f8","Type":"ContainerStarted","Data":"35f343e33827004b4f74faf1cc4f44de5d6b6889cfda9af6b494f232f93fe897"}
Feb 25 11:16:18 crc kubenswrapper[4725]: I0225 11:16:18.703506 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-hrfcv"
Feb 25 11:16:18 crc kubenswrapper[4725]: I0225 11:16:18.734177 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-hrfcv" podStartSLOduration=2.734146861 podStartE2EDuration="2.734146861s" podCreationTimestamp="2026-02-25 11:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:16:18.723734852 +0000 UTC m=+1404.222316907" watchObservedRunningTime="2026-02-25 11:16:18.734146861 +0000 UTC m=+1404.232728936"
Feb 25 11:16:26 crc kubenswrapper[4725]: I0225 11:16:26.402111 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-hrfcv"
Feb 25 11:16:26 crc kubenswrapper[4725]: I0225 11:16:26.460583 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-spjhq"]
Feb 25 11:16:26 crc kubenswrapper[4725]: I0225 11:16:26.460855 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" podUID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerName="dnsmasq-dns" containerID="cri-o://38492d23d6477b8aee035783cffae5cafe54f4f057d05e30d0f77bcaf262bcf9" gracePeriod=10
Feb 25 11:16:26 crc kubenswrapper[4725]: I0225 11:16:26.820058 4725 generic.go:334] "Generic (PLEG): container finished" podID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerID="38492d23d6477b8aee035783cffae5cafe54f4f057d05e30d0f77bcaf262bcf9" exitCode=0
Feb 25 11:16:26 crc kubenswrapper[4725]: I0225 11:16:26.820100 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" event={"ID":"c9a23d62-04c9-41c8-a213-fb9d604a1494","Type":"ContainerDied","Data":"38492d23d6477b8aee035783cffae5cafe54f4f057d05e30d0f77bcaf262bcf9"}
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.054842 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq"
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.159816 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-svc\") pod \"c9a23d62-04c9-41c8-a213-fb9d604a1494\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") "
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.159910 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-config\") pod \"c9a23d62-04c9-41c8-a213-fb9d604a1494\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") "
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.159939 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-nb\") pod \"c9a23d62-04c9-41c8-a213-fb9d604a1494\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") "
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.160017 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-swift-storage-0\") pod \"c9a23d62-04c9-41c8-a213-fb9d604a1494\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") "
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.160106 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-openstack-edpm-ipam\") pod \"c9a23d62-04c9-41c8-a213-fb9d604a1494\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") "
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.160133 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-sb\") pod \"c9a23d62-04c9-41c8-a213-fb9d604a1494\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") "
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.160182 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxdc\" (UniqueName: \"kubernetes.io/projected/c9a23d62-04c9-41c8-a213-fb9d604a1494-kube-api-access-mtxdc\") pod \"c9a23d62-04c9-41c8-a213-fb9d604a1494\" (UID: \"c9a23d62-04c9-41c8-a213-fb9d604a1494\") "
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.170563 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a23d62-04c9-41c8-a213-fb9d604a1494-kube-api-access-mtxdc" (OuterVolumeSpecName: "kube-api-access-mtxdc") pod "c9a23d62-04c9-41c8-a213-fb9d604a1494" (UID: "c9a23d62-04c9-41c8-a213-fb9d604a1494"). InnerVolumeSpecName "kube-api-access-mtxdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.211142 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c9a23d62-04c9-41c8-a213-fb9d604a1494" (UID: "c9a23d62-04c9-41c8-a213-fb9d604a1494"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.211165 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c9a23d62-04c9-41c8-a213-fb9d604a1494" (UID: "c9a23d62-04c9-41c8-a213-fb9d604a1494"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.212040 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c9a23d62-04c9-41c8-a213-fb9d604a1494" (UID: "c9a23d62-04c9-41c8-a213-fb9d604a1494"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.212533 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c9a23d62-04c9-41c8-a213-fb9d604a1494" (UID: "c9a23d62-04c9-41c8-a213-fb9d604a1494"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.213156 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-config" (OuterVolumeSpecName: "config") pod "c9a23d62-04c9-41c8-a213-fb9d604a1494" (UID: "c9a23d62-04c9-41c8-a213-fb9d604a1494"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.214208 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c9a23d62-04c9-41c8-a213-fb9d604a1494" (UID: "c9a23d62-04c9-41c8-a213-fb9d604a1494"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.270780 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.270814 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.270837 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.270847 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtxdc\" (UniqueName: \"kubernetes.io/projected/c9a23d62-04c9-41c8-a213-fb9d604a1494-kube-api-access-mtxdc\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.270880 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.270890 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-config\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.270899 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9a23d62-04c9-41c8-a213-fb9d604a1494-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.832210 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq" event={"ID":"c9a23d62-04c9-41c8-a213-fb9d604a1494","Type":"ContainerDied","Data":"e4d8fe75e1694b304eb8297e605530abfd274949c9327b7279064d0eba19fe72"}
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.832611 4725 scope.go:117] "RemoveContainer" containerID="38492d23d6477b8aee035783cffae5cafe54f4f057d05e30d0f77bcaf262bcf9"
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.832331 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-spjhq"
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.866802 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-spjhq"]
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.871668 4725 scope.go:117] "RemoveContainer" containerID="245ee6443a8400e8f863a2cc43af10fb5c6762d079402996b9d82af920adc14a"
Feb 25 11:16:27 crc kubenswrapper[4725]: I0225 11:16:27.877531 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-spjhq"]
Feb 25 11:16:29 crc kubenswrapper[4725]: I0225 11:16:29.241370 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a23d62-04c9-41c8-a213-fb9d604a1494" path="/var/lib/kubelet/pods/c9a23d62-04c9-41c8-a213-fb9d604a1494/volumes"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.082891 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"]
Feb 25 11:16:35 crc kubenswrapper[4725]: E0225 11:16:35.083814 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerName="dnsmasq-dns"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.084048 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerName="dnsmasq-dns"
Feb 25 11:16:35 crc kubenswrapper[4725]: E0225 11:16:35.084075 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerName="dnsmasq-dns"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.084084 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerName="dnsmasq-dns"
Feb 25 11:16:35 crc kubenswrapper[4725]: E0225 11:16:35.084104 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerName="init"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.084134 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerName="init"
Feb 25 11:16:35 crc kubenswrapper[4725]: E0225 11:16:35.084148 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerName="init"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.084155 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerName="init"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.084370 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a23d62-04c9-41c8-a213-fb9d604a1494" containerName="dnsmasq-dns"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.084411 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed310acc-141b-4704-85b7-cc6761c13c0a" containerName="dnsmasq-dns"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.086416 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.088864 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.089109 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.089243 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.090333 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.097973 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"]
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.213724 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.213775 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.213813 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.213878 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4kxq\" (UniqueName: \"kubernetes.io/projected/c034211a-1e4c-4636-9f07-a8c4b89bed34-kube-api-access-h4kxq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.315299 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.315462 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4kxq\" (UniqueName: \"kubernetes.io/projected/c034211a-1e4c-4636-9f07-a8c4b89bed34-kube-api-access-h4kxq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.315893 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.315963 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.322192 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.323148 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.336379 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.337181 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4kxq\" (UniqueName: \"kubernetes.io/projected/c034211a-1e4c-4636-9f07-a8c4b89bed34-kube-api-access-h4kxq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.410630 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:35 crc kubenswrapper[4725]: I0225 11:16:35.933374 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"]
Feb 25 11:16:35 crc kubenswrapper[4725]: W0225 11:16:35.940878 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc034211a_1e4c_4636_9f07_a8c4b89bed34.slice/crio-7da0e1da5c4c951354a666f885dc4f27fe17a1022f0d4ca1c563cab2c07fca95 WatchSource:0}: Error finding container 7da0e1da5c4c951354a666f885dc4f27fe17a1022f0d4ca1c563cab2c07fca95: Status 404 returned error can't find the container with id 7da0e1da5c4c951354a666f885dc4f27fe17a1022f0d4ca1c563cab2c07fca95
Feb 25 11:16:36 crc kubenswrapper[4725]: I0225 11:16:36.916999 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw" event={"ID":"c034211a-1e4c-4636-9f07-a8c4b89bed34","Type":"ContainerStarted","Data":"7da0e1da5c4c951354a666f885dc4f27fe17a1022f0d4ca1c563cab2c07fca95"}
Feb 25 11:16:38 crc kubenswrapper[4725]: I0225 11:16:38.937628 4725 generic.go:334] "Generic (PLEG): container finished" podID="8cd71ea0-569c-4093-931d-2e0c841bcbf4" containerID="1d547e1172dccb16f832d91fa7152957fecb8ccf693948d90560da45fbb9a595" exitCode=0
Feb 25 11:16:38 crc kubenswrapper[4725]: I0225 11:16:38.937734 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8cd71ea0-569c-4093-931d-2e0c841bcbf4","Type":"ContainerDied","Data":"1d547e1172dccb16f832d91fa7152957fecb8ccf693948d90560da45fbb9a595"}
Feb 25 11:16:39 crc kubenswrapper[4725]: I0225 11:16:39.951603 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8cd71ea0-569c-4093-931d-2e0c841bcbf4","Type":"ContainerStarted","Data":"1bf9cbbe9576a954a826f9b6182030f09adf8ad0204d85f366ca843fbb3f4034"}
Feb 25 11:16:39 crc kubenswrapper[4725]: I0225 11:16:39.952190 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 25 11:16:39 crc kubenswrapper[4725]: I0225 11:16:39.979034 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.979014403 podStartE2EDuration="36.979014403s" podCreationTimestamp="2026-02-25 11:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:16:39.977522363 +0000 UTC m=+1425.476104438" watchObservedRunningTime="2026-02-25 11:16:39.979014403 +0000 UTC m=+1425.477596438"
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.555525 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.556001 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.556074 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.557094 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11e1b1cdb4e476cda22a21020fd383eb9bc627ad8cf9f3e9b918adf3b517b8b4"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.557223 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://11e1b1cdb4e476cda22a21020fd383eb9bc627ad8cf9f3e9b918adf3b517b8b4" gracePeriod=600
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.978960 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="11e1b1cdb4e476cda22a21020fd383eb9bc627ad8cf9f3e9b918adf3b517b8b4" exitCode=0
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.979039 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"11e1b1cdb4e476cda22a21020fd383eb9bc627ad8cf9f3e9b918adf3b517b8b4"}
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.979077 4725 scope.go:117] "RemoveContainer" containerID="e9d1cf00d5958f238b464e2eb2f371e000d949ef3901a3f7ece30337723bea95"
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.985000 4725 generic.go:334] "Generic (PLEG): container finished" podID="5bb7295b-193b-45b6-8913-8508d190e664" containerID="3abf807e6a486c82987753a9b5bfbb667af7a27bfc9a71036dc5f79655f1b691" exitCode=0
Feb 25 11:16:41 crc kubenswrapper[4725]: I0225 11:16:41.985043 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bb7295b-193b-45b6-8913-8508d190e664","Type":"ContainerDied","Data":"3abf807e6a486c82987753a9b5bfbb667af7a27bfc9a71036dc5f79655f1b691"}
Feb 25 11:16:46 crc kubenswrapper[4725]: I0225 11:16:46.037687 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5"}
Feb 25 11:16:46 crc kubenswrapper[4725]: I0225 11:16:46.041307 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw" event={"ID":"c034211a-1e4c-4636-9f07-a8c4b89bed34","Type":"ContainerStarted","Data":"70b2ff983fc56c6cbd4d9e22a5c9ea6e7585e1892499489a2ee05ef6cb8f81d1"}
Feb 25 11:16:46 crc kubenswrapper[4725]: I0225 11:16:46.044483 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bb7295b-193b-45b6-8913-8508d190e664","Type":"ContainerStarted","Data":"c62d728df061f055308a7223cc66ed5e874b7ffa25f6e6eac028a0a037eed872"}
Feb 25 11:16:46 crc kubenswrapper[4725]: I0225 11:16:46.044669 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 25 11:16:46 crc kubenswrapper[4725]: I0225 11:16:46.073261 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw" podStartSLOduration=1.744162676 podStartE2EDuration="11.073241707s" podCreationTimestamp="2026-02-25 11:16:35 +0000 UTC" firstStartedPulling="2026-02-25 11:16:35.943165044 +0000 UTC m=+1421.441747069" lastFinishedPulling="2026-02-25 11:16:45.272244075 +0000 UTC m=+1430.770826100" observedRunningTime="2026-02-25 11:16:46.065402517 +0000 UTC m=+1431.563984552" watchObservedRunningTime="2026-02-25 11:16:46.073241707 +0000 UTC m=+1431.571823742"
Feb 25 11:16:46 crc kubenswrapper[4725]: I0225 11:16:46.096571 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.09655185 podStartE2EDuration="41.09655185s" podCreationTimestamp="2026-02-25 11:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:16:46.087677203 +0000 UTC m=+1431.586259248" watchObservedRunningTime="2026-02-25 11:16:46.09655185 +0000 UTC m=+1431.595133875"
Feb 25 11:16:53 crc kubenswrapper[4725]: I0225 11:16:53.847141 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 25 11:16:56 crc kubenswrapper[4725]: I0225 11:16:56.153056 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 25 11:16:57 crc kubenswrapper[4725]: I0225 11:16:57.143203 4725 generic.go:334] "Generic (PLEG): container finished" podID="c034211a-1e4c-4636-9f07-a8c4b89bed34" containerID="70b2ff983fc56c6cbd4d9e22a5c9ea6e7585e1892499489a2ee05ef6cb8f81d1" exitCode=0
Feb 25 11:16:57 crc kubenswrapper[4725]: I0225 11:16:57.143292 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw" event={"ID":"c034211a-1e4c-4636-9f07-a8c4b89bed34","Type":"ContainerDied","Data":"70b2ff983fc56c6cbd4d9e22a5c9ea6e7585e1892499489a2ee05ef6cb8f81d1"}
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.552914 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.578299 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-inventory\") pod \"c034211a-1e4c-4636-9f07-a8c4b89bed34\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") "
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.578422 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-repo-setup-combined-ca-bundle\") pod \"c034211a-1e4c-4636-9f07-a8c4b89bed34\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") "
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.578621 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4kxq\" (UniqueName: \"kubernetes.io/projected/c034211a-1e4c-4636-9f07-a8c4b89bed34-kube-api-access-h4kxq\") pod \"c034211a-1e4c-4636-9f07-a8c4b89bed34\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") "
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.578689 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-ssh-key-openstack-edpm-ipam\") pod \"c034211a-1e4c-4636-9f07-a8c4b89bed34\" (UID: \"c034211a-1e4c-4636-9f07-a8c4b89bed34\") "
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.584650 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c034211a-1e4c-4636-9f07-a8c4b89bed34" (UID: "c034211a-1e4c-4636-9f07-a8c4b89bed34"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.584710 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c034211a-1e4c-4636-9f07-a8c4b89bed34-kube-api-access-h4kxq" (OuterVolumeSpecName: "kube-api-access-h4kxq") pod "c034211a-1e4c-4636-9f07-a8c4b89bed34" (UID: "c034211a-1e4c-4636-9f07-a8c4b89bed34"). InnerVolumeSpecName "kube-api-access-h4kxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.609696 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-inventory" (OuterVolumeSpecName: "inventory") pod "c034211a-1e4c-4636-9f07-a8c4b89bed34" (UID: "c034211a-1e4c-4636-9f07-a8c4b89bed34"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.612990 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c034211a-1e4c-4636-9f07-a8c4b89bed34" (UID: "c034211a-1e4c-4636-9f07-a8c4b89bed34"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.681312 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-inventory\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.681343 4725 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.681354 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4kxq\" (UniqueName: \"kubernetes.io/projected/c034211a-1e4c-4636-9f07-a8c4b89bed34-kube-api-access-h4kxq\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:58 crc kubenswrapper[4725]: I0225 11:16:58.681363 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c034211a-1e4c-4636-9f07-a8c4b89bed34-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.167805 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw" event={"ID":"c034211a-1e4c-4636-9f07-a8c4b89bed34","Type":"ContainerDied","Data":"7da0e1da5c4c951354a666f885dc4f27fe17a1022f0d4ca1c563cab2c07fca95"}
Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.167917 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7da0e1da5c4c951354a666f885dc4f27fe17a1022f0d4ca1c563cab2c07fca95"
Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.167922 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw"
Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.278432 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8"]
Feb 25 11:16:59 crc kubenswrapper[4725]: E0225 11:16:59.278908 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c034211a-1e4c-4636-9f07-a8c4b89bed34" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.278931 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c034211a-1e4c-4636-9f07-a8c4b89bed34" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.279224 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c034211a-1e4c-4636-9f07-a8c4b89bed34" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.280406 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.283619 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.288616 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.288616 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.288626 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.304813 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8"] Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.398331 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bplm4\" (UniqueName: \"kubernetes.io/projected/a8206236-adf4-4501-bbc7-6333709aa101-kube-api-access-bplm4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.398516 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.398559 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.500212 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.501236 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.501425 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bplm4\" (UniqueName: \"kubernetes.io/projected/a8206236-adf4-4501-bbc7-6333709aa101-kube-api-access-bplm4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.504951 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: 
\"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.509451 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.534262 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bplm4\" (UniqueName: \"kubernetes.io/projected/a8206236-adf4-4501-bbc7-6333709aa101-kube-api-access-bplm4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mlkj8\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:16:59 crc kubenswrapper[4725]: I0225 11:16:59.601742 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:17:00 crc kubenswrapper[4725]: W0225 11:17:00.233894 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8206236_adf4_4501_bbc7_6333709aa101.slice/crio-fd8934c55320858720d7ed3fa41cb3c5bc712d5fd03c9a1a1f8918bb8adf5316 WatchSource:0}: Error finding container fd8934c55320858720d7ed3fa41cb3c5bc712d5fd03c9a1a1f8918bb8adf5316: Status 404 returned error can't find the container with id fd8934c55320858720d7ed3fa41cb3c5bc712d5fd03c9a1a1f8918bb8adf5316 Feb 25 11:17:00 crc kubenswrapper[4725]: I0225 11:17:00.234511 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8"] Feb 25 11:17:01 crc kubenswrapper[4725]: I0225 11:17:01.191595 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" event={"ID":"a8206236-adf4-4501-bbc7-6333709aa101","Type":"ContainerStarted","Data":"9abdb080fca761136b190ed8c2f0fdfa62643e23000a3eba39458396edc756b2"} Feb 25 11:17:01 crc kubenswrapper[4725]: I0225 11:17:01.191984 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" event={"ID":"a8206236-adf4-4501-bbc7-6333709aa101","Type":"ContainerStarted","Data":"fd8934c55320858720d7ed3fa41cb3c5bc712d5fd03c9a1a1f8918bb8adf5316"} Feb 25 11:17:01 crc kubenswrapper[4725]: I0225 11:17:01.225103 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" podStartSLOduration=1.785148023 podStartE2EDuration="2.225069874s" podCreationTimestamp="2026-02-25 11:16:59 +0000 UTC" firstStartedPulling="2026-02-25 11:17:00.238086399 +0000 UTC m=+1445.736668464" lastFinishedPulling="2026-02-25 11:17:00.67800825 +0000 UTC m=+1446.176590315" observedRunningTime="2026-02-25 
11:17:01.220683167 +0000 UTC m=+1446.719265202" watchObservedRunningTime="2026-02-25 11:17:01.225069874 +0000 UTC m=+1446.723651979" Feb 25 11:17:04 crc kubenswrapper[4725]: I0225 11:17:04.224262 4725 generic.go:334] "Generic (PLEG): container finished" podID="a8206236-adf4-4501-bbc7-6333709aa101" containerID="9abdb080fca761136b190ed8c2f0fdfa62643e23000a3eba39458396edc756b2" exitCode=0 Feb 25 11:17:04 crc kubenswrapper[4725]: I0225 11:17:04.224382 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" event={"ID":"a8206236-adf4-4501-bbc7-6333709aa101","Type":"ContainerDied","Data":"9abdb080fca761136b190ed8c2f0fdfa62643e23000a3eba39458396edc756b2"} Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.764658 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.872720 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-inventory\") pod \"a8206236-adf4-4501-bbc7-6333709aa101\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.872916 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-ssh-key-openstack-edpm-ipam\") pod \"a8206236-adf4-4501-bbc7-6333709aa101\" (UID: \"a8206236-adf4-4501-bbc7-6333709aa101\") " Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.873002 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bplm4\" (UniqueName: \"kubernetes.io/projected/a8206236-adf4-4501-bbc7-6333709aa101-kube-api-access-bplm4\") pod \"a8206236-adf4-4501-bbc7-6333709aa101\" (UID: 
\"a8206236-adf4-4501-bbc7-6333709aa101\") " Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.881919 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8206236-adf4-4501-bbc7-6333709aa101-kube-api-access-bplm4" (OuterVolumeSpecName: "kube-api-access-bplm4") pod "a8206236-adf4-4501-bbc7-6333709aa101" (UID: "a8206236-adf4-4501-bbc7-6333709aa101"). InnerVolumeSpecName "kube-api-access-bplm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.901355 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-inventory" (OuterVolumeSpecName: "inventory") pod "a8206236-adf4-4501-bbc7-6333709aa101" (UID: "a8206236-adf4-4501-bbc7-6333709aa101"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.916798 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a8206236-adf4-4501-bbc7-6333709aa101" (UID: "a8206236-adf4-4501-bbc7-6333709aa101"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.976159 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.976217 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bplm4\" (UniqueName: \"kubernetes.io/projected/a8206236-adf4-4501-bbc7-6333709aa101-kube-api-access-bplm4\") on node \"crc\" DevicePath \"\"" Feb 25 11:17:05 crc kubenswrapper[4725]: I0225 11:17:05.976237 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8206236-adf4-4501-bbc7-6333709aa101-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.246271 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" event={"ID":"a8206236-adf4-4501-bbc7-6333709aa101","Type":"ContainerDied","Data":"fd8934c55320858720d7ed3fa41cb3c5bc712d5fd03c9a1a1f8918bb8adf5316"} Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.246308 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd8934c55320858720d7ed3fa41cb3c5bc712d5fd03c9a1a1f8918bb8adf5316" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.246393 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mlkj8" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.358475 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl"] Feb 25 11:17:06 crc kubenswrapper[4725]: E0225 11:17:06.359117 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8206236-adf4-4501-bbc7-6333709aa101" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.359149 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8206236-adf4-4501-bbc7-6333709aa101" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.359443 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8206236-adf4-4501-bbc7-6333709aa101" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.360435 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.363653 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.364154 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.364172 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.368734 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.377660 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl"] Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.494315 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45lgk\" (UniqueName: \"kubernetes.io/projected/a1b2db62-0e44-475c-bd55-aeceb2068aed-kube-api-access-45lgk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.494454 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.494800 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.495034 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.597665 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45lgk\" (UniqueName: \"kubernetes.io/projected/a1b2db62-0e44-475c-bd55-aeceb2068aed-kube-api-access-45lgk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.597779 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.598000 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.598094 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.603480 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.606618 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.606651 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.622819 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45lgk\" (UniqueName: \"kubernetes.io/projected/a1b2db62-0e44-475c-bd55-aeceb2068aed-kube-api-access-45lgk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:06 crc kubenswrapper[4725]: I0225 11:17:06.685310 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:17:07 crc kubenswrapper[4725]: I0225 11:17:07.249233 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl"] Feb 25 11:17:07 crc kubenswrapper[4725]: W0225 11:17:07.259070 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1b2db62_0e44_475c_bd55_aeceb2068aed.slice/crio-2c681e1d102f2b23814f04faf3944d2dfb76b6dab67bf02f5a7228125d5daa28 WatchSource:0}: Error finding container 2c681e1d102f2b23814f04faf3944d2dfb76b6dab67bf02f5a7228125d5daa28: Status 404 returned error can't find the container with id 2c681e1d102f2b23814f04faf3944d2dfb76b6dab67bf02f5a7228125d5daa28 Feb 25 11:17:08 crc kubenswrapper[4725]: I0225 11:17:08.264268 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" event={"ID":"a1b2db62-0e44-475c-bd55-aeceb2068aed","Type":"ContainerStarted","Data":"8a561313050d9f5994213166d0808e17a8157d5c53e70298b5de47837e97a83c"} Feb 25 11:17:08 crc kubenswrapper[4725]: I0225 11:17:08.265130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" 
event={"ID":"a1b2db62-0e44-475c-bd55-aeceb2068aed","Type":"ContainerStarted","Data":"2c681e1d102f2b23814f04faf3944d2dfb76b6dab67bf02f5a7228125d5daa28"} Feb 25 11:17:17 crc kubenswrapper[4725]: I0225 11:17:17.298177 4725 scope.go:117] "RemoveContainer" containerID="d03c3df2831f435eec79aa4c11fd77f615f21dd1d257dba5c76fc719b708a1de" Feb 25 11:17:17 crc kubenswrapper[4725]: I0225 11:17:17.322988 4725 scope.go:117] "RemoveContainer" containerID="2af8e6e7a562b7ee4af79f43dae1477fc3030c2af6ebca94387b740f4bd7db9a" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.165970 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" podStartSLOduration=36.593035399 podStartE2EDuration="37.165951985s" podCreationTimestamp="2026-02-25 11:17:06 +0000 UTC" firstStartedPulling="2026-02-25 11:17:07.261727649 +0000 UTC m=+1452.760309674" lastFinishedPulling="2026-02-25 11:17:07.834644215 +0000 UTC m=+1453.333226260" observedRunningTime="2026-02-25 11:17:08.294073637 +0000 UTC m=+1453.792655682" watchObservedRunningTime="2026-02-25 11:17:43.165951985 +0000 UTC m=+1488.664534020" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.167423 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5mr86"] Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.170120 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.212382 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mr86"] Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.255227 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d4b15a-99ab-4671-bc50-6790e38d355c-utilities\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.255309 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d4b15a-99ab-4671-bc50-6790e38d355c-catalog-content\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.255345 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7k4m\" (UniqueName: \"kubernetes.io/projected/12d4b15a-99ab-4671-bc50-6790e38d355c-kube-api-access-x7k4m\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.357040 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d4b15a-99ab-4671-bc50-6790e38d355c-utilities\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.357157 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d4b15a-99ab-4671-bc50-6790e38d355c-catalog-content\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.357194 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7k4m\" (UniqueName: \"kubernetes.io/projected/12d4b15a-99ab-4671-bc50-6790e38d355c-kube-api-access-x7k4m\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.357655 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d4b15a-99ab-4671-bc50-6790e38d355c-utilities\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.357686 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d4b15a-99ab-4671-bc50-6790e38d355c-catalog-content\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.380547 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7k4m\" (UniqueName: \"kubernetes.io/projected/12d4b15a-99ab-4671-bc50-6790e38d355c-kube-api-access-x7k4m\") pod \"community-operators-5mr86\" (UID: \"12d4b15a-99ab-4671-bc50-6790e38d355c\") " pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:43 crc kubenswrapper[4725]: I0225 11:17:43.504602 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:17:44 crc kubenswrapper[4725]: I0225 11:17:44.006356 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mr86"] Feb 25 11:17:44 crc kubenswrapper[4725]: I0225 11:17:44.686679 4725 generic.go:334] "Generic (PLEG): container finished" podID="12d4b15a-99ab-4671-bc50-6790e38d355c" containerID="9c0a715f33a934b9e4546058651ba9c8516c23dd5785e272ac2cf1a0e59fe490" exitCode=0 Feb 25 11:17:44 crc kubenswrapper[4725]: I0225 11:17:44.686726 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mr86" event={"ID":"12d4b15a-99ab-4671-bc50-6790e38d355c","Type":"ContainerDied","Data":"9c0a715f33a934b9e4546058651ba9c8516c23dd5785e272ac2cf1a0e59fe490"} Feb 25 11:17:44 crc kubenswrapper[4725]: I0225 11:17:44.686998 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mr86" event={"ID":"12d4b15a-99ab-4671-bc50-6790e38d355c","Type":"ContainerStarted","Data":"f538bf9d7e70e3fd32d51f84e85634f1647165e20b348e525b4d02736b2caba4"} Feb 25 11:17:44 crc kubenswrapper[4725]: I0225 11:17:44.688300 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:17:49 crc kubenswrapper[4725]: I0225 11:17:49.735267 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mr86" event={"ID":"12d4b15a-99ab-4671-bc50-6790e38d355c","Type":"ContainerStarted","Data":"de6bd8d6e2d44a0c5de6eaaa7567d392b54e1715d11354f439324b21ddd2125e"} Feb 25 11:17:50 crc kubenswrapper[4725]: I0225 11:17:50.751530 4725 generic.go:334] "Generic (PLEG): container finished" podID="12d4b15a-99ab-4671-bc50-6790e38d355c" containerID="de6bd8d6e2d44a0c5de6eaaa7567d392b54e1715d11354f439324b21ddd2125e" exitCode=0 Feb 25 11:17:50 crc kubenswrapper[4725]: I0225 11:17:50.751634 4725 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-5mr86" event={"ID":"12d4b15a-99ab-4671-bc50-6790e38d355c","Type":"ContainerDied","Data":"de6bd8d6e2d44a0c5de6eaaa7567d392b54e1715d11354f439324b21ddd2125e"} Feb 25 11:17:55 crc kubenswrapper[4725]: I0225 11:17:55.817347 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mr86" event={"ID":"12d4b15a-99ab-4671-bc50-6790e38d355c","Type":"ContainerStarted","Data":"ff9a99ded8ebf8a3bb0fc75426f295d2c83693ae45de39026da4d6d96683d07e"} Feb 25 11:17:55 crc kubenswrapper[4725]: I0225 11:17:55.863760 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5mr86" podStartSLOduration=2.240045814 podStartE2EDuration="12.863741436s" podCreationTimestamp="2026-02-25 11:17:43 +0000 UTC" firstStartedPulling="2026-02-25 11:17:44.688068936 +0000 UTC m=+1490.186650961" lastFinishedPulling="2026-02-25 11:17:55.311764558 +0000 UTC m=+1500.810346583" observedRunningTime="2026-02-25 11:17:55.855097476 +0000 UTC m=+1501.353679501" watchObservedRunningTime="2026-02-25 11:17:55.863741436 +0000 UTC m=+1501.362323461" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.157528 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533638-fgdv8"] Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.160172 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533638-fgdv8" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.163412 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.164048 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.164309 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.177548 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533638-fgdv8"] Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.248198 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljnc5\" (UniqueName: \"kubernetes.io/projected/4a382086-c357-46af-83de-2b0e8cfeb4cc-kube-api-access-ljnc5\") pod \"auto-csr-approver-29533638-fgdv8\" (UID: \"4a382086-c357-46af-83de-2b0e8cfeb4cc\") " pod="openshift-infra/auto-csr-approver-29533638-fgdv8" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.350058 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljnc5\" (UniqueName: \"kubernetes.io/projected/4a382086-c357-46af-83de-2b0e8cfeb4cc-kube-api-access-ljnc5\") pod \"auto-csr-approver-29533638-fgdv8\" (UID: \"4a382086-c357-46af-83de-2b0e8cfeb4cc\") " pod="openshift-infra/auto-csr-approver-29533638-fgdv8" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.374942 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljnc5\" (UniqueName: \"kubernetes.io/projected/4a382086-c357-46af-83de-2b0e8cfeb4cc-kube-api-access-ljnc5\") pod \"auto-csr-approver-29533638-fgdv8\" (UID: \"4a382086-c357-46af-83de-2b0e8cfeb4cc\") " 
pod="openshift-infra/auto-csr-approver-29533638-fgdv8" Feb 25 11:18:00 crc kubenswrapper[4725]: I0225 11:18:00.501971 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533638-fgdv8" Feb 25 11:18:01 crc kubenswrapper[4725]: I0225 11:18:01.056508 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533638-fgdv8"] Feb 25 11:18:01 crc kubenswrapper[4725]: I0225 11:18:01.899201 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533638-fgdv8" event={"ID":"4a382086-c357-46af-83de-2b0e8cfeb4cc","Type":"ContainerStarted","Data":"49436e032c25668d11410272b39a5d01fcfb5ec260b1e9c0a848814301f86bf1"} Feb 25 11:18:02 crc kubenswrapper[4725]: I0225 11:18:02.915667 4725 generic.go:334] "Generic (PLEG): container finished" podID="4a382086-c357-46af-83de-2b0e8cfeb4cc" containerID="b15d2fb9893207719ff5fbd160c98448cd731de7d9e7eecd467fafaf7e6d64fa" exitCode=0 Feb 25 11:18:02 crc kubenswrapper[4725]: I0225 11:18:02.915756 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533638-fgdv8" event={"ID":"4a382086-c357-46af-83de-2b0e8cfeb4cc","Type":"ContainerDied","Data":"b15d2fb9893207719ff5fbd160c98448cd731de7d9e7eecd467fafaf7e6d64fa"} Feb 25 11:18:03 crc kubenswrapper[4725]: I0225 11:18:03.505159 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:18:03 crc kubenswrapper[4725]: I0225 11:18:03.505532 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:18:03 crc kubenswrapper[4725]: I0225 11:18:03.582218 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.005421 4725 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5mr86" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.105715 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mr86"] Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.165748 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-889vj"] Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.166145 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-889vj" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="registry-server" containerID="cri-o://aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841" gracePeriod=2 Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.359551 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533638-fgdv8" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.558194 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljnc5\" (UniqueName: \"kubernetes.io/projected/4a382086-c357-46af-83de-2b0e8cfeb4cc-kube-api-access-ljnc5\") pod \"4a382086-c357-46af-83de-2b0e8cfeb4cc\" (UID: \"4a382086-c357-46af-83de-2b0e8cfeb4cc\") " Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.568230 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a382086-c357-46af-83de-2b0e8cfeb4cc-kube-api-access-ljnc5" (OuterVolumeSpecName: "kube-api-access-ljnc5") pod "4a382086-c357-46af-83de-2b0e8cfeb4cc" (UID: "4a382086-c357-46af-83de-2b0e8cfeb4cc"). InnerVolumeSpecName "kube-api-access-ljnc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.616939 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-889vj" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.660806 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59wj9\" (UniqueName: \"kubernetes.io/projected/16868507-af62-4b1b-bf7c-317fe4e2c94e-kube-api-access-59wj9\") pod \"16868507-af62-4b1b-bf7c-317fe4e2c94e\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.660942 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-catalog-content\") pod \"16868507-af62-4b1b-bf7c-317fe4e2c94e\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.661237 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-utilities\") pod \"16868507-af62-4b1b-bf7c-317fe4e2c94e\" (UID: \"16868507-af62-4b1b-bf7c-317fe4e2c94e\") " Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.661787 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljnc5\" (UniqueName: \"kubernetes.io/projected/4a382086-c357-46af-83de-2b0e8cfeb4cc-kube-api-access-ljnc5\") on node \"crc\" DevicePath \"\"" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.661916 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-utilities" (OuterVolumeSpecName: "utilities") pod "16868507-af62-4b1b-bf7c-317fe4e2c94e" (UID: "16868507-af62-4b1b-bf7c-317fe4e2c94e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.666181 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16868507-af62-4b1b-bf7c-317fe4e2c94e-kube-api-access-59wj9" (OuterVolumeSpecName: "kube-api-access-59wj9") pod "16868507-af62-4b1b-bf7c-317fe4e2c94e" (UID: "16868507-af62-4b1b-bf7c-317fe4e2c94e"). InnerVolumeSpecName "kube-api-access-59wj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.724808 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16868507-af62-4b1b-bf7c-317fe4e2c94e" (UID: "16868507-af62-4b1b-bf7c-317fe4e2c94e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.763495 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.763549 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59wj9\" (UniqueName: \"kubernetes.io/projected/16868507-af62-4b1b-bf7c-317fe4e2c94e-kube-api-access-59wj9\") on node \"crc\" DevicePath \"\"" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.763572 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16868507-af62-4b1b-bf7c-317fe4e2c94e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.941098 4725 generic.go:334] "Generic (PLEG): container finished" podID="16868507-af62-4b1b-bf7c-317fe4e2c94e" 
containerID="aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841" exitCode=0 Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.941196 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-889vj" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.941196 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889vj" event={"ID":"16868507-af62-4b1b-bf7c-317fe4e2c94e","Type":"ContainerDied","Data":"aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841"} Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.941332 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889vj" event={"ID":"16868507-af62-4b1b-bf7c-317fe4e2c94e","Type":"ContainerDied","Data":"4c302fd06c12002c6de93388d1e1d55da1c88bd61781f430af30231396dce41c"} Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.941351 4725 scope.go:117] "RemoveContainer" containerID="aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.944338 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533638-fgdv8" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.944880 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533638-fgdv8" event={"ID":"4a382086-c357-46af-83de-2b0e8cfeb4cc","Type":"ContainerDied","Data":"49436e032c25668d11410272b39a5d01fcfb5ec260b1e9c0a848814301f86bf1"} Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.944900 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49436e032c25668d11410272b39a5d01fcfb5ec260b1e9c0a848814301f86bf1" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.991288 4725 scope.go:117] "RemoveContainer" containerID="063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b" Feb 25 11:18:04 crc kubenswrapper[4725]: I0225 11:18:04.992271 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-889vj"] Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.004966 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-889vj"] Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.024026 4725 scope.go:117] "RemoveContainer" containerID="e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab" Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.057077 4725 scope.go:117] "RemoveContainer" containerID="aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841" Feb 25 11:18:05 crc kubenswrapper[4725]: E0225 11:18:05.057798 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841\": container with ID starting with aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841 not found: ID does not exist" containerID="aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841" Feb 25 11:18:05 crc 
kubenswrapper[4725]: I0225 11:18:05.058004 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841"} err="failed to get container status \"aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841\": rpc error: code = NotFound desc = could not find container \"aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841\": container with ID starting with aab2e0d12da7f66227024b66ed6ccfcbd0220977fecb3af5408773a60d8ff841 not found: ID does not exist" Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.058147 4725 scope.go:117] "RemoveContainer" containerID="063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b" Feb 25 11:18:05 crc kubenswrapper[4725]: E0225 11:18:05.058678 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b\": container with ID starting with 063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b not found: ID does not exist" containerID="063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b" Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.058718 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b"} err="failed to get container status \"063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b\": rpc error: code = NotFound desc = could not find container \"063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b\": container with ID starting with 063f0a3df6894c200bfd0e2d5bcfdc7af7288e178c1e0ce6edcc0123d7203c6b not found: ID does not exist" Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.058744 4725 scope.go:117] "RemoveContainer" containerID="e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab" Feb 25 
11:18:05 crc kubenswrapper[4725]: E0225 11:18:05.058990 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab\": container with ID starting with e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab not found: ID does not exist" containerID="e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab" Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.059019 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab"} err="failed to get container status \"e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab\": rpc error: code = NotFound desc = could not find container \"e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab\": container with ID starting with e248c2acebdc25c3240da0a25ce07e89bc27aee8161cb84f782b90eb008b10ab not found: ID does not exist" Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.236707 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" path="/var/lib/kubelet/pods/16868507-af62-4b1b-bf7c-317fe4e2c94e/volumes" Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.434022 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533632-msl4k"] Feb 25 11:18:05 crc kubenswrapper[4725]: I0225 11:18:05.444984 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533632-msl4k"] Feb 25 11:18:07 crc kubenswrapper[4725]: I0225 11:18:07.236032 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b720fd7-adf3-460d-a61b-832c8c974dc0" path="/var/lib/kubelet/pods/0b720fd7-adf3-460d-a61b-832c8c974dc0/volumes" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.416933 4725 scope.go:117] "RemoveContainer" 
containerID="c1e2b612ea1bf2ce371b8fd8ec86f336fdff9025bb674d976c721485aa505dbb" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.456854 4725 scope.go:117] "RemoveContainer" containerID="f1d7f096c6457b9c43337ba45d7ab3a88bf8346cec3e8b1587f647a6b5aff2af" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.491157 4725 scope.go:117] "RemoveContainer" containerID="3d8cbaae69be3731abeeaddc0ffa201ef89d35ac7a5041c913b8bf86d8bc2854" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.551860 4725 scope.go:117] "RemoveContainer" containerID="cc458f72993980725388a3b3f0c97c2fe01765ceeda69dec8ffe26f437197b33" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.587149 4725 scope.go:117] "RemoveContainer" containerID="e534b69731edb04a186d0e470c9dd4206bc4dc71418fa371203f3a49f8f4ed68" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.648609 4725 scope.go:117] "RemoveContainer" containerID="8c2a5a7f2c12e174eb0d21615a7ac8d84e21b646284bb0b0a912c6ca48a88f8f" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.679465 4725 scope.go:117] "RemoveContainer" containerID="742c379fd1cce6dd119e7b9543d95d51a4735d72b9453b172fd15b8c71a2bb4f" Feb 25 11:18:17 crc kubenswrapper[4725]: I0225 11:18:17.716729 4725 scope.go:117] "RemoveContainer" containerID="3115750fe611e697d5a909af4e3a31bf48b8f0e2ea525b29b833d0b1345582d5" Feb 25 11:19:11 crc kubenswrapper[4725]: I0225 11:19:11.555888 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:19:11 crc kubenswrapper[4725]: I0225 11:19:11.556559 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:19:17 crc kubenswrapper[4725]: I0225 11:19:17.869337 4725 scope.go:117] "RemoveContainer" containerID="01fe3b1ee2f8aa8ca4385d279b32ba554348f15c838f6ba17a89bae0bc2fb4a5" Feb 25 11:19:17 crc kubenswrapper[4725]: I0225 11:19:17.891775 4725 scope.go:117] "RemoveContainer" containerID="8b367172e8919f938670f03a6303378703dfbba29b2de04882da1c7955816207" Feb 25 11:19:41 crc kubenswrapper[4725]: I0225 11:19:41.555486 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:19:41 crc kubenswrapper[4725]: I0225 11:19:41.556246 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.892708 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5qcjn"] Feb 25 11:19:42 crc kubenswrapper[4725]: E0225 11:19:42.894988 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="extract-utilities" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.895034 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="extract-utilities" Feb 25 11:19:42 crc kubenswrapper[4725]: E0225 11:19:42.895062 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="registry-server" Feb 25 11:19:42 crc 
kubenswrapper[4725]: I0225 11:19:42.895075 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="registry-server" Feb 25 11:19:42 crc kubenswrapper[4725]: E0225 11:19:42.895114 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="extract-content" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.895128 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="extract-content" Feb 25 11:19:42 crc kubenswrapper[4725]: E0225 11:19:42.895150 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a382086-c357-46af-83de-2b0e8cfeb4cc" containerName="oc" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.895162 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a382086-c357-46af-83de-2b0e8cfeb4cc" containerName="oc" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.895508 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a382086-c357-46af-83de-2b0e8cfeb4cc" containerName="oc" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.895531 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="16868507-af62-4b1b-bf7c-317fe4e2c94e" containerName="registry-server" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.898319 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:42 crc kubenswrapper[4725]: I0225 11:19:42.920340 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qcjn"] Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.091971 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqfn\" (UniqueName: \"kubernetes.io/projected/e6cae2f0-a453-4259-ab63-659928874d51-kube-api-access-8nqfn\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.092310 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-utilities\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.092533 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-catalog-content\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.194195 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqfn\" (UniqueName: \"kubernetes.io/projected/e6cae2f0-a453-4259-ab63-659928874d51-kube-api-access-8nqfn\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.194294 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-utilities\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.194357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-catalog-content\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.195174 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-utilities\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.195285 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-catalog-content\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.229921 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqfn\" (UniqueName: \"kubernetes.io/projected/e6cae2f0-a453-4259-ab63-659928874d51-kube-api-access-8nqfn\") pod \"redhat-marketplace-5qcjn\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.524964 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:43 crc kubenswrapper[4725]: I0225 11:19:43.993417 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qcjn"] Feb 25 11:19:44 crc kubenswrapper[4725]: W0225 11:19:44.014693 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6cae2f0_a453_4259_ab63_659928874d51.slice/crio-6557123ad73d145165fc9c67c58de9cd9e7eb0745341004efe0c9a96588e2e27 WatchSource:0}: Error finding container 6557123ad73d145165fc9c67c58de9cd9e7eb0745341004efe0c9a96588e2e27: Status 404 returned error can't find the container with id 6557123ad73d145165fc9c67c58de9cd9e7eb0745341004efe0c9a96588e2e27 Feb 25 11:19:44 crc kubenswrapper[4725]: I0225 11:19:44.887470 4725 generic.go:334] "Generic (PLEG): container finished" podID="e6cae2f0-a453-4259-ab63-659928874d51" containerID="4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6" exitCode=0 Feb 25 11:19:44 crc kubenswrapper[4725]: I0225 11:19:44.887579 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qcjn" event={"ID":"e6cae2f0-a453-4259-ab63-659928874d51","Type":"ContainerDied","Data":"4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6"} Feb 25 11:19:44 crc kubenswrapper[4725]: I0225 11:19:44.887987 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qcjn" event={"ID":"e6cae2f0-a453-4259-ab63-659928874d51","Type":"ContainerStarted","Data":"6557123ad73d145165fc9c67c58de9cd9e7eb0745341004efe0c9a96588e2e27"} Feb 25 11:19:46 crc kubenswrapper[4725]: I0225 11:19:46.918426 4725 generic.go:334] "Generic (PLEG): container finished" podID="e6cae2f0-a453-4259-ab63-659928874d51" containerID="c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858" exitCode=0 Feb 25 11:19:46 crc kubenswrapper[4725]: I0225 
11:19:46.918507 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qcjn" event={"ID":"e6cae2f0-a453-4259-ab63-659928874d51","Type":"ContainerDied","Data":"c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858"} Feb 25 11:19:47 crc kubenswrapper[4725]: I0225 11:19:47.935134 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qcjn" event={"ID":"e6cae2f0-a453-4259-ab63-659928874d51","Type":"ContainerStarted","Data":"f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904"} Feb 25 11:19:47 crc kubenswrapper[4725]: I0225 11:19:47.982084 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5qcjn" podStartSLOduration=3.322335312 podStartE2EDuration="5.982022612s" podCreationTimestamp="2026-02-25 11:19:42 +0000 UTC" firstStartedPulling="2026-02-25 11:19:44.88945683 +0000 UTC m=+1610.388038885" lastFinishedPulling="2026-02-25 11:19:47.54914412 +0000 UTC m=+1613.047726185" observedRunningTime="2026-02-25 11:19:47.970913146 +0000 UTC m=+1613.469495221" watchObservedRunningTime="2026-02-25 11:19:47.982022612 +0000 UTC m=+1613.480604677" Feb 25 11:19:53 crc kubenswrapper[4725]: I0225 11:19:53.525708 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:53 crc kubenswrapper[4725]: I0225 11:19:53.526258 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:53 crc kubenswrapper[4725]: I0225 11:19:53.618920 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:54 crc kubenswrapper[4725]: I0225 11:19:54.081421 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 
11:19:54 crc kubenswrapper[4725]: I0225 11:19:54.146039 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qcjn"] Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.026618 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5qcjn" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="registry-server" containerID="cri-o://f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904" gracePeriod=2 Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.470339 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.562044 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-utilities\") pod \"e6cae2f0-a453-4259-ab63-659928874d51\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.562209 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-catalog-content\") pod \"e6cae2f0-a453-4259-ab63-659928874d51\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.562321 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nqfn\" (UniqueName: \"kubernetes.io/projected/e6cae2f0-a453-4259-ab63-659928874d51-kube-api-access-8nqfn\") pod \"e6cae2f0-a453-4259-ab63-659928874d51\" (UID: \"e6cae2f0-a453-4259-ab63-659928874d51\") " Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.563510 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-utilities" (OuterVolumeSpecName: "utilities") pod "e6cae2f0-a453-4259-ab63-659928874d51" (UID: "e6cae2f0-a453-4259-ab63-659928874d51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.569185 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cae2f0-a453-4259-ab63-659928874d51-kube-api-access-8nqfn" (OuterVolumeSpecName: "kube-api-access-8nqfn") pod "e6cae2f0-a453-4259-ab63-659928874d51" (UID: "e6cae2f0-a453-4259-ab63-659928874d51"). InnerVolumeSpecName "kube-api-access-8nqfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.664395 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nqfn\" (UniqueName: \"kubernetes.io/projected/e6cae2f0-a453-4259-ab63-659928874d51-kube-api-access-8nqfn\") on node \"crc\" DevicePath \"\"" Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.664434 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.720018 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6cae2f0-a453-4259-ab63-659928874d51" (UID: "e6cae2f0-a453-4259-ab63-659928874d51"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:19:56 crc kubenswrapper[4725]: I0225 11:19:56.766347 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cae2f0-a453-4259-ab63-659928874d51-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.040214 4725 generic.go:334] "Generic (PLEG): container finished" podID="e6cae2f0-a453-4259-ab63-659928874d51" containerID="f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904" exitCode=0 Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.040255 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qcjn" event={"ID":"e6cae2f0-a453-4259-ab63-659928874d51","Type":"ContainerDied","Data":"f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904"} Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.040291 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qcjn" event={"ID":"e6cae2f0-a453-4259-ab63-659928874d51","Type":"ContainerDied","Data":"6557123ad73d145165fc9c67c58de9cd9e7eb0745341004efe0c9a96588e2e27"} Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.040310 4725 scope.go:117] "RemoveContainer" containerID="f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.040317 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qcjn" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.068683 4725 scope.go:117] "RemoveContainer" containerID="c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.102806 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qcjn"] Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.110031 4725 scope.go:117] "RemoveContainer" containerID="4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.111233 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qcjn"] Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.147815 4725 scope.go:117] "RemoveContainer" containerID="f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904" Feb 25 11:19:57 crc kubenswrapper[4725]: E0225 11:19:57.148374 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904\": container with ID starting with f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904 not found: ID does not exist" containerID="f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.148410 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904"} err="failed to get container status \"f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904\": rpc error: code = NotFound desc = could not find container \"f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904\": container with ID starting with f7d229cb1a65825f05df6c87273b8f13eb0159e7c25a1a548c95454c1ada8904 not found: 
ID does not exist" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.148436 4725 scope.go:117] "RemoveContainer" containerID="c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858" Feb 25 11:19:57 crc kubenswrapper[4725]: E0225 11:19:57.148774 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858\": container with ID starting with c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858 not found: ID does not exist" containerID="c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.148802 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858"} err="failed to get container status \"c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858\": rpc error: code = NotFound desc = could not find container \"c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858\": container with ID starting with c5ffefb597451db4e70fd400227ef849f1a5e4b80556900310070040420ca858 not found: ID does not exist" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.148821 4725 scope.go:117] "RemoveContainer" containerID="4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6" Feb 25 11:19:57 crc kubenswrapper[4725]: E0225 11:19:57.149188 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6\": container with ID starting with 4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6 not found: ID does not exist" containerID="4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.149214 4725 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6"} err="failed to get container status \"4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6\": rpc error: code = NotFound desc = could not find container \"4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6\": container with ID starting with 4329753dadff2487cf8c9dba8cffcc3822b7dab9ffcf9fa27263f20ce3355bd6 not found: ID does not exist" Feb 25 11:19:57 crc kubenswrapper[4725]: I0225 11:19:57.239619 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cae2f0-a453-4259-ab63-659928874d51" path="/var/lib/kubelet/pods/e6cae2f0-a453-4259-ab63-659928874d51/volumes" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.181429 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533640-q8ps4"] Feb 25 11:20:00 crc kubenswrapper[4725]: E0225 11:20:00.182294 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="extract-content" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.182308 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="extract-content" Feb 25 11:20:00 crc kubenswrapper[4725]: E0225 11:20:00.182318 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="extract-utilities" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.182327 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="extract-utilities" Feb 25 11:20:00 crc kubenswrapper[4725]: E0225 11:20:00.182366 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="registry-server" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.182375 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="registry-server" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.182618 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cae2f0-a453-4259-ab63-659928874d51" containerName="registry-server" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.183328 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533640-q8ps4" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.185491 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.185878 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.186603 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.195631 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533640-q8ps4"] Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.365868 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nnb7\" (UniqueName: \"kubernetes.io/projected/4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d-kube-api-access-4nnb7\") pod \"auto-csr-approver-29533640-q8ps4\" (UID: \"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d\") " pod="openshift-infra/auto-csr-approver-29533640-q8ps4" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.468288 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nnb7\" (UniqueName: \"kubernetes.io/projected/4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d-kube-api-access-4nnb7\") pod \"auto-csr-approver-29533640-q8ps4\" (UID: 
\"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d\") " pod="openshift-infra/auto-csr-approver-29533640-q8ps4" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.492210 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nnb7\" (UniqueName: \"kubernetes.io/projected/4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d-kube-api-access-4nnb7\") pod \"auto-csr-approver-29533640-q8ps4\" (UID: \"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d\") " pod="openshift-infra/auto-csr-approver-29533640-q8ps4" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.504347 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533640-q8ps4" Feb 25 11:20:00 crc kubenswrapper[4725]: I0225 11:20:00.995745 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533640-q8ps4"] Feb 25 11:20:01 crc kubenswrapper[4725]: I0225 11:20:01.092798 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533640-q8ps4" event={"ID":"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d","Type":"ContainerStarted","Data":"5866d8d26c2c599df87359cd44ee587fcd448d381fc1aeadd0d4008de5be57ae"} Feb 25 11:20:03 crc kubenswrapper[4725]: I0225 11:20:03.122100 4725 generic.go:334] "Generic (PLEG): container finished" podID="4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d" containerID="99e302b9e980f477bf5fa5dfb9f14b3e2ec114c5a5908f015ef8bbe7d8463d06" exitCode=0 Feb 25 11:20:03 crc kubenswrapper[4725]: I0225 11:20:03.122188 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533640-q8ps4" event={"ID":"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d","Type":"ContainerDied","Data":"99e302b9e980f477bf5fa5dfb9f14b3e2ec114c5a5908f015ef8bbe7d8463d06"} Feb 25 11:20:04 crc kubenswrapper[4725]: I0225 11:20:04.520324 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533640-q8ps4" Feb 25 11:20:04 crc kubenswrapper[4725]: I0225 11:20:04.647607 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nnb7\" (UniqueName: \"kubernetes.io/projected/4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d-kube-api-access-4nnb7\") pod \"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d\" (UID: \"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d\") " Feb 25 11:20:04 crc kubenswrapper[4725]: I0225 11:20:04.653945 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d-kube-api-access-4nnb7" (OuterVolumeSpecName: "kube-api-access-4nnb7") pod "4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d" (UID: "4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d"). InnerVolumeSpecName "kube-api-access-4nnb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:20:04 crc kubenswrapper[4725]: I0225 11:20:04.750257 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nnb7\" (UniqueName: \"kubernetes.io/projected/4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d-kube-api-access-4nnb7\") on node \"crc\" DevicePath \"\"" Feb 25 11:20:05 crc kubenswrapper[4725]: I0225 11:20:05.146608 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533640-q8ps4" event={"ID":"4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d","Type":"ContainerDied","Data":"5866d8d26c2c599df87359cd44ee587fcd448d381fc1aeadd0d4008de5be57ae"} Feb 25 11:20:05 crc kubenswrapper[4725]: I0225 11:20:05.146665 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5866d8d26c2c599df87359cd44ee587fcd448d381fc1aeadd0d4008de5be57ae" Feb 25 11:20:05 crc kubenswrapper[4725]: I0225 11:20:05.146701 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533640-q8ps4" Feb 25 11:20:05 crc kubenswrapper[4725]: I0225 11:20:05.597572 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533634-rq22r"] Feb 25 11:20:05 crc kubenswrapper[4725]: I0225 11:20:05.605150 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533634-rq22r"] Feb 25 11:20:07 crc kubenswrapper[4725]: I0225 11:20:07.236408 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532209c0-1111-4779-9620-732c8d611e1c" path="/var/lib/kubelet/pods/532209c0-1111-4779-9620-732c8d611e1c/volumes" Feb 25 11:20:11 crc kubenswrapper[4725]: I0225 11:20:11.556299 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:20:11 crc kubenswrapper[4725]: I0225 11:20:11.557017 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:20:11 crc kubenswrapper[4725]: I0225 11:20:11.557078 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 11:20:11 crc kubenswrapper[4725]: I0225 11:20:11.558002 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 11:20:11 crc kubenswrapper[4725]: I0225 11:20:11.558090 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" gracePeriod=600 Feb 25 11:20:11 crc kubenswrapper[4725]: E0225 11:20:11.703338 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:20:12 crc kubenswrapper[4725]: I0225 11:20:12.217515 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" exitCode=0 Feb 25 11:20:12 crc kubenswrapper[4725]: I0225 11:20:12.217673 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5"} Feb 25 11:20:12 crc kubenswrapper[4725]: I0225 11:20:12.217767 4725 scope.go:117] "RemoveContainer" containerID="11e1b1cdb4e476cda22a21020fd383eb9bc627ad8cf9f3e9b918adf3b517b8b4" Feb 25 11:20:12 crc kubenswrapper[4725]: I0225 11:20:12.218442 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:20:12 crc kubenswrapper[4725]: E0225 11:20:12.218785 4725 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:20:17 crc kubenswrapper[4725]: I0225 11:20:17.966258 4725 scope.go:117] "RemoveContainer" containerID="0f1f3c64366c6ea77197b031bf340956d44d6efe1cac7cb78e09e8e0f77ea6d9" Feb 25 11:20:25 crc kubenswrapper[4725]: I0225 11:20:25.233260 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:20:25 crc kubenswrapper[4725]: E0225 11:20:25.233933 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:20:40 crc kubenswrapper[4725]: I0225 11:20:40.224679 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:20:40 crc kubenswrapper[4725]: E0225 11:20:40.226063 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:20:43 crc kubenswrapper[4725]: I0225 
11:20:43.585229 4725 generic.go:334] "Generic (PLEG): container finished" podID="a1b2db62-0e44-475c-bd55-aeceb2068aed" containerID="8a561313050d9f5994213166d0808e17a8157d5c53e70298b5de47837e97a83c" exitCode=0 Feb 25 11:20:43 crc kubenswrapper[4725]: I0225 11:20:43.585546 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" event={"ID":"a1b2db62-0e44-475c-bd55-aeceb2068aed","Type":"ContainerDied","Data":"8a561313050d9f5994213166d0808e17a8157d5c53e70298b5de47837e97a83c"} Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.057723 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.175152 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45lgk\" (UniqueName: \"kubernetes.io/projected/a1b2db62-0e44-475c-bd55-aeceb2068aed-kube-api-access-45lgk\") pod \"a1b2db62-0e44-475c-bd55-aeceb2068aed\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.175965 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-ssh-key-openstack-edpm-ipam\") pod \"a1b2db62-0e44-475c-bd55-aeceb2068aed\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.176129 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-bootstrap-combined-ca-bundle\") pod \"a1b2db62-0e44-475c-bd55-aeceb2068aed\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.176891 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-inventory\") pod \"a1b2db62-0e44-475c-bd55-aeceb2068aed\" (UID: \"a1b2db62-0e44-475c-bd55-aeceb2068aed\") " Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.182663 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a1b2db62-0e44-475c-bd55-aeceb2068aed" (UID: "a1b2db62-0e44-475c-bd55-aeceb2068aed"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.182729 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b2db62-0e44-475c-bd55-aeceb2068aed-kube-api-access-45lgk" (OuterVolumeSpecName: "kube-api-access-45lgk") pod "a1b2db62-0e44-475c-bd55-aeceb2068aed" (UID: "a1b2db62-0e44-475c-bd55-aeceb2068aed"). InnerVolumeSpecName "kube-api-access-45lgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.210583 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1b2db62-0e44-475c-bd55-aeceb2068aed" (UID: "a1b2db62-0e44-475c-bd55-aeceb2068aed"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.215654 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-inventory" (OuterVolumeSpecName: "inventory") pod "a1b2db62-0e44-475c-bd55-aeceb2068aed" (UID: "a1b2db62-0e44-475c-bd55-aeceb2068aed"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.281079 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.281111 4725 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.281125 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1b2db62-0e44-475c-bd55-aeceb2068aed-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.281140 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45lgk\" (UniqueName: \"kubernetes.io/projected/a1b2db62-0e44-475c-bd55-aeceb2068aed-kube-api-access-45lgk\") on node \"crc\" DevicePath \"\"" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.614464 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" event={"ID":"a1b2db62-0e44-475c-bd55-aeceb2068aed","Type":"ContainerDied","Data":"2c681e1d102f2b23814f04faf3944d2dfb76b6dab67bf02f5a7228125d5daa28"} Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.614513 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c681e1d102f2b23814f04faf3944d2dfb76b6dab67bf02f5a7228125d5daa28" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.614573 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.743241 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6"] Feb 25 11:20:45 crc kubenswrapper[4725]: E0225 11:20:45.743746 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d" containerName="oc" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.743774 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d" containerName="oc" Feb 25 11:20:45 crc kubenswrapper[4725]: E0225 11:20:45.743789 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b2db62-0e44-475c-bd55-aeceb2068aed" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.743803 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b2db62-0e44-475c-bd55-aeceb2068aed" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.744139 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d" containerName="oc" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.744187 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b2db62-0e44-475c-bd55-aeceb2068aed" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.745111 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.755386 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.755788 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.755815 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.755812 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.756301 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6"] Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.792764 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqth\" (UniqueName: \"kubernetes.io/projected/5bbf0497-1315-4613-b6ff-c826f5cf2a75-kube-api-access-dnqth\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.793473 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc 
kubenswrapper[4725]: I0225 11:20:45.793570 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.894823 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.894914 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.894980 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqth\" (UniqueName: \"kubernetes.io/projected/5bbf0497-1315-4613-b6ff-c826f5cf2a75-kube-api-access-dnqth\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.899687 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.899940 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:45 crc kubenswrapper[4725]: I0225 11:20:45.922049 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqth\" (UniqueName: \"kubernetes.io/projected/5bbf0497-1315-4613-b6ff-c826f5cf2a75-kube-api-access-dnqth\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:46 crc kubenswrapper[4725]: I0225 11:20:46.074496 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:20:46 crc kubenswrapper[4725]: I0225 11:20:46.691047 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6"] Feb 25 11:20:46 crc kubenswrapper[4725]: W0225 11:20:46.699356 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bbf0497_1315_4613_b6ff_c826f5cf2a75.slice/crio-23c607633877bd8b4aebad6e29fb7431821282d24b4fcaeb1ad5334be6020c21 WatchSource:0}: Error finding container 23c607633877bd8b4aebad6e29fb7431821282d24b4fcaeb1ad5334be6020c21: Status 404 returned error can't find the container with id 23c607633877bd8b4aebad6e29fb7431821282d24b4fcaeb1ad5334be6020c21 Feb 25 11:20:47 crc kubenswrapper[4725]: I0225 11:20:47.637504 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" event={"ID":"5bbf0497-1315-4613-b6ff-c826f5cf2a75","Type":"ContainerStarted","Data":"23c607633877bd8b4aebad6e29fb7431821282d24b4fcaeb1ad5334be6020c21"} Feb 25 11:20:48 crc kubenswrapper[4725]: I0225 11:20:48.653654 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" event={"ID":"5bbf0497-1315-4613-b6ff-c826f5cf2a75","Type":"ContainerStarted","Data":"229586d3fa3e1dca2ebdd18bac3ae81a938c8dd7c485f63fbcfb642a07f81fb0"} Feb 25 11:20:48 crc kubenswrapper[4725]: I0225 11:20:48.694620 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" podStartSLOduration=2.779326331 podStartE2EDuration="3.694598856s" podCreationTimestamp="2026-02-25 11:20:45 +0000 UTC" firstStartedPulling="2026-02-25 11:20:46.702927493 +0000 UTC m=+1672.201509528" lastFinishedPulling="2026-02-25 11:20:47.618200028 +0000 UTC 
m=+1673.116782053" observedRunningTime="2026-02-25 11:20:48.684327509 +0000 UTC m=+1674.182909604" watchObservedRunningTime="2026-02-25 11:20:48.694598856 +0000 UTC m=+1674.193180891" Feb 25 11:20:52 crc kubenswrapper[4725]: I0225 11:20:52.224618 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:20:52 crc kubenswrapper[4725]: E0225 11:20:52.225147 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:21:03 crc kubenswrapper[4725]: I0225 11:21:03.226315 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:21:03 crc kubenswrapper[4725]: E0225 11:21:03.227773 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:21:16 crc kubenswrapper[4725]: I0225 11:21:16.224756 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:21:16 crc kubenswrapper[4725]: E0225 11:21:16.226104 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.042066 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-bb4jr"] Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.055587 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fdca-account-create-update-998xh"] Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.067230 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-kzkj5"] Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.083170 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4cc8-account-create-update-gxqmd"] Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.092672 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-bb4jr"] Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.101713 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-kzkj5"] Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.108592 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4cc8-account-create-update-gxqmd"] Feb 25 11:21:18 crc kubenswrapper[4725]: I0225 11:21:18.116434 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fdca-account-create-update-998xh"] Feb 25 11:21:19 crc kubenswrapper[4725]: I0225 11:21:19.234506 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="422d7ab0-0190-46dc-976e-e827bb7b48e8" path="/var/lib/kubelet/pods/422d7ab0-0190-46dc-976e-e827bb7b48e8/volumes" Feb 25 11:21:19 crc kubenswrapper[4725]: I0225 11:21:19.235398 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8094412c-eb55-4366-a7a2-0bd29cff2983" 
path="/var/lib/kubelet/pods/8094412c-eb55-4366-a7a2-0bd29cff2983/volumes" Feb 25 11:21:19 crc kubenswrapper[4725]: I0225 11:21:19.236288 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2ef8b2-0d39-411c-b91a-a396aa246f66" path="/var/lib/kubelet/pods/ad2ef8b2-0d39-411c-b91a-a396aa246f66/volumes" Feb 25 11:21:19 crc kubenswrapper[4725]: I0225 11:21:19.237271 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09" path="/var/lib/kubelet/pods/bc03bd2e-9d03-4ff9-ba01-c24bd7c00b09/volumes" Feb 25 11:21:22 crc kubenswrapper[4725]: I0225 11:21:22.035504 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jfvkj"] Feb 25 11:21:22 crc kubenswrapper[4725]: I0225 11:21:22.044409 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jfvkj"] Feb 25 11:21:22 crc kubenswrapper[4725]: I0225 11:21:22.056998 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e644-account-create-update-g2stm"] Feb 25 11:21:22 crc kubenswrapper[4725]: I0225 11:21:22.067112 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e644-account-create-update-g2stm"] Feb 25 11:21:23 crc kubenswrapper[4725]: I0225 11:21:23.235503 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43468cb6-ecc1-44f0-b5f0-7de8f76cc465" path="/var/lib/kubelet/pods/43468cb6-ecc1-44f0-b5f0-7de8f76cc465/volumes" Feb 25 11:21:23 crc kubenswrapper[4725]: I0225 11:21:23.237436 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c835a1-6f18-44d6-a4ce-669692e0e6d9" path="/var/lib/kubelet/pods/e7c835a1-6f18-44d6-a4ce-669692e0e6d9/volumes" Feb 25 11:21:28 crc kubenswrapper[4725]: I0225 11:21:28.224811 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:21:28 crc kubenswrapper[4725]: E0225 11:21:28.225888 4725 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:21:41 crc kubenswrapper[4725]: I0225 11:21:41.225327 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:21:41 crc kubenswrapper[4725]: E0225 11:21:41.226530 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:21:43 crc kubenswrapper[4725]: I0225 11:21:43.050032 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zf9sp"] Feb 25 11:21:43 crc kubenswrapper[4725]: I0225 11:21:43.059983 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zf9sp"] Feb 25 11:21:43 crc kubenswrapper[4725]: I0225 11:21:43.242383 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6784bd0f-0863-4990-bc78-c04561fbd465" path="/var/lib/kubelet/pods/6784bd0f-0863-4990-bc78-c04561fbd465/volumes" Feb 25 11:21:55 crc kubenswrapper[4725]: I0225 11:21:55.028134 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jtvbl"] Feb 25 11:21:55 crc kubenswrapper[4725]: I0225 11:21:55.035346 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jtvbl"] Feb 
25 11:21:55 crc kubenswrapper[4725]: I0225 11:21:55.244711 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fc0f62-6868-40e9-a04e-5de23ca3e5fe" path="/var/lib/kubelet/pods/88fc0f62-6868-40e9-a04e-5de23ca3e5fe/volumes" Feb 25 11:21:55 crc kubenswrapper[4725]: I0225 11:21:55.245029 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:21:55 crc kubenswrapper[4725]: E0225 11:21:55.245404 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.170365 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533642-wgj8t"] Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.173324 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.175810 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.176743 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.177211 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.183432 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533642-wgj8t"] Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.223446 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp74z\" (UniqueName: \"kubernetes.io/projected/685d89a8-03f3-40ab-849f-27f44039ebe9-kube-api-access-hp74z\") pod \"auto-csr-approver-29533642-wgj8t\" (UID: \"685d89a8-03f3-40ab-849f-27f44039ebe9\") " pod="openshift-infra/auto-csr-approver-29533642-wgj8t" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.326358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp74z\" (UniqueName: \"kubernetes.io/projected/685d89a8-03f3-40ab-849f-27f44039ebe9-kube-api-access-hp74z\") pod \"auto-csr-approver-29533642-wgj8t\" (UID: \"685d89a8-03f3-40ab-849f-27f44039ebe9\") " pod="openshift-infra/auto-csr-approver-29533642-wgj8t" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.357404 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp74z\" (UniqueName: \"kubernetes.io/projected/685d89a8-03f3-40ab-849f-27f44039ebe9-kube-api-access-hp74z\") pod \"auto-csr-approver-29533642-wgj8t\" (UID: \"685d89a8-03f3-40ab-849f-27f44039ebe9\") " 
pod="openshift-infra/auto-csr-approver-29533642-wgj8t" Feb 25 11:22:00 crc kubenswrapper[4725]: I0225 11:22:00.513714 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" Feb 25 11:22:01 crc kubenswrapper[4725]: I0225 11:22:01.009007 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533642-wgj8t"] Feb 25 11:22:01 crc kubenswrapper[4725]: I0225 11:22:01.427860 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" event={"ID":"685d89a8-03f3-40ab-849f-27f44039ebe9","Type":"ContainerStarted","Data":"f8d498f2fe58aa707b8a635df2284808d828ca0f00db08b756943e9f56527ef9"} Feb 25 11:22:03 crc kubenswrapper[4725]: I0225 11:22:03.451207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" event={"ID":"685d89a8-03f3-40ab-849f-27f44039ebe9","Type":"ContainerStarted","Data":"ea4c686170b5ab97a3c63c9f80407722fe3ffbabcf0db08138db8f066669bc36"} Feb 25 11:22:03 crc kubenswrapper[4725]: I0225 11:22:03.476661 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" podStartSLOduration=1.423507049 podStartE2EDuration="3.476639475s" podCreationTimestamp="2026-02-25 11:22:00 +0000 UTC" firstStartedPulling="2026-02-25 11:22:00.99628644 +0000 UTC m=+1746.494868505" lastFinishedPulling="2026-02-25 11:22:03.049418886 +0000 UTC m=+1748.548000931" observedRunningTime="2026-02-25 11:22:03.465526187 +0000 UTC m=+1748.964108212" watchObservedRunningTime="2026-02-25 11:22:03.476639475 +0000 UTC m=+1748.975221510" Feb 25 11:22:04 crc kubenswrapper[4725]: I0225 11:22:04.467396 4725 generic.go:334] "Generic (PLEG): container finished" podID="685d89a8-03f3-40ab-849f-27f44039ebe9" containerID="ea4c686170b5ab97a3c63c9f80407722fe3ffbabcf0db08138db8f066669bc36" exitCode=0 Feb 25 11:22:04 crc 
kubenswrapper[4725]: I0225 11:22:04.467484 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" event={"ID":"685d89a8-03f3-40ab-849f-27f44039ebe9","Type":"ContainerDied","Data":"ea4c686170b5ab97a3c63c9f80407722fe3ffbabcf0db08138db8f066669bc36"} Feb 25 11:22:05 crc kubenswrapper[4725]: I0225 11:22:05.814080 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" Feb 25 11:22:05 crc kubenswrapper[4725]: I0225 11:22:05.971544 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp74z\" (UniqueName: \"kubernetes.io/projected/685d89a8-03f3-40ab-849f-27f44039ebe9-kube-api-access-hp74z\") pod \"685d89a8-03f3-40ab-849f-27f44039ebe9\" (UID: \"685d89a8-03f3-40ab-849f-27f44039ebe9\") " Feb 25 11:22:05 crc kubenswrapper[4725]: I0225 11:22:05.977975 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685d89a8-03f3-40ab-849f-27f44039ebe9-kube-api-access-hp74z" (OuterVolumeSpecName: "kube-api-access-hp74z") pod "685d89a8-03f3-40ab-849f-27f44039ebe9" (UID: "685d89a8-03f3-40ab-849f-27f44039ebe9"). InnerVolumeSpecName "kube-api-access-hp74z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:22:06 crc kubenswrapper[4725]: I0225 11:22:06.073557 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp74z\" (UniqueName: \"kubernetes.io/projected/685d89a8-03f3-40ab-849f-27f44039ebe9-kube-api-access-hp74z\") on node \"crc\" DevicePath \"\"" Feb 25 11:22:06 crc kubenswrapper[4725]: I0225 11:22:06.490477 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" event={"ID":"685d89a8-03f3-40ab-849f-27f44039ebe9","Type":"ContainerDied","Data":"f8d498f2fe58aa707b8a635df2284808d828ca0f00db08b756943e9f56527ef9"} Feb 25 11:22:06 crc kubenswrapper[4725]: I0225 11:22:06.490528 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d498f2fe58aa707b8a635df2284808d828ca0f00db08b756943e9f56527ef9" Feb 25 11:22:06 crc kubenswrapper[4725]: I0225 11:22:06.490600 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533642-wgj8t" Feb 25 11:22:06 crc kubenswrapper[4725]: I0225 11:22:06.542475 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533636-7qvvt"] Feb 25 11:22:06 crc kubenswrapper[4725]: I0225 11:22:06.551819 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533636-7qvvt"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.082798 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e785-account-create-update-hhpzr"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.099771 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-9l6bp"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.108056 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e785-account-create-update-hhpzr"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 
11:22:07.115758 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7f38-account-create-update-ggt5m"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.123124 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6xkqp"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.130311 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-9l6bp"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.137192 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7f38-account-create-update-ggt5m"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.143868 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6xkqp"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.151692 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1a81-account-create-update-7k7sp"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.164652 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1a81-account-create-update-7k7sp"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.165384 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ghddn"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.174245 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ghddn"] Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.243374 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f83fe8-1a7b-4411-9dc3-611c0affe393" path="/var/lib/kubelet/pods/17f83fe8-1a7b-4411-9dc3-611c0affe393/volumes" Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.244099 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262e67d4-08ee-405a-aa45-14c222a8e9f1" path="/var/lib/kubelet/pods/262e67d4-08ee-405a-aa45-14c222a8e9f1/volumes" Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 
11:22:07.244608 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31ef9958-bb4e-4bf1-a118-d11d04bff97b" path="/var/lib/kubelet/pods/31ef9958-bb4e-4bf1-a118-d11d04bff97b/volumes" Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.245146 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43106b29-d57b-47d4-90dd-9ea16422dc05" path="/var/lib/kubelet/pods/43106b29-d57b-47d4-90dd-9ea16422dc05/volumes" Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.246091 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a6913c-322e-4be4-acd7-29a649757554" path="/var/lib/kubelet/pods/63a6913c-322e-4be4-acd7-29a649757554/volumes" Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.246644 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b501c3bd-07f8-4780-8b55-14db55bc346f" path="/var/lib/kubelet/pods/b501c3bd-07f8-4780-8b55-14db55bc346f/volumes" Feb 25 11:22:07 crc kubenswrapper[4725]: I0225 11:22:07.247228 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8d476c-f390-4e03-a518-e0998e0586df" path="/var/lib/kubelet/pods/de8d476c-f390-4e03-a518-e0998e0586df/volumes" Feb 25 11:22:09 crc kubenswrapper[4725]: I0225 11:22:09.224900 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:22:09 crc kubenswrapper[4725]: E0225 11:22:09.225586 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:22:12 crc kubenswrapper[4725]: I0225 11:22:12.037696 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-sync-thmbj"] Feb 25 11:22:12 crc kubenswrapper[4725]: I0225 11:22:12.054931 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-thmbj"] Feb 25 11:22:13 crc kubenswrapper[4725]: I0225 11:22:13.234544 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a4bfbae-237f-4d52-9b5d-f47217a2c88c" path="/var/lib/kubelet/pods/6a4bfbae-237f-4d52-9b5d-f47217a2c88c/volumes" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.094105 4725 scope.go:117] "RemoveContainer" containerID="eebb3933bcc50618730ab1a0fa945eb463582eb4b6ca24b3dd338def8b68d79b" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.141164 4725 scope.go:117] "RemoveContainer" containerID="04fe06c2b6d673e2ef30572a7e476f5fda8aecca44a89396f766f1cc9d31f721" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.206257 4725 scope.go:117] "RemoveContainer" containerID="71dc519a8394ebf396361c5207f79340bf5433e2dda989ab616b9bbd6e2d41ad" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.291460 4725 scope.go:117] "RemoveContainer" containerID="32d8bb86ee8f58e701702a0b106c6c70953402777c533db91beae7134a2416af" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.314682 4725 scope.go:117] "RemoveContainer" containerID="3c5dcef313b89405ca14bcd4691e17176c33f08f540c16811345ad7cd839f737" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.376264 4725 scope.go:117] "RemoveContainer" containerID="6a05cc28b3b8bceca89cd70d0309955773352e37a6023bfc9dabdc1113178faf" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.410275 4725 scope.go:117] "RemoveContainer" containerID="3cdaf2838439ac380a611605ecaa3171e841cf7a49575bfd2d7230d9cc03c5d6" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.462881 4725 scope.go:117] "RemoveContainer" containerID="3d3e86dc4494d1dc674a9e4d1bcf89efcc35c66a729e039eb01e34bf77a14955" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.489519 4725 scope.go:117] "RemoveContainer" 
containerID="4e542341c3e93604d629b01baf6f5217bb7546587b44eabe4c3a16ae31a4e166" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.509962 4725 scope.go:117] "RemoveContainer" containerID="cec057bfffb3bb2d8ca364f7a56b71c477134ce0ec09829a5ce7ddee17499e07" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.529609 4725 scope.go:117] "RemoveContainer" containerID="4d89a61cf32541dd22b107d0313929e2b1347111d44d1af4d344b8ea2e1aa9ec" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.556067 4725 scope.go:117] "RemoveContainer" containerID="806db84677e11365b8ea664801ada795eed2f35ed7ced59f3563df3bf8411ec5" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.578107 4725 scope.go:117] "RemoveContainer" containerID="08157017ad2ccea94a32e94385eaaf166984488b53f4b13f17f761edb2ec0134" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.598118 4725 scope.go:117] "RemoveContainer" containerID="619a4c29f7698049b8e454a562d218ae23d6b3401c83a252ce8839e13d98d54d" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.622352 4725 scope.go:117] "RemoveContainer" containerID="aa58b712619d1f2d6db3ad97f9a2389e783cd9d557280bc73843561d8ea6bbc1" Feb 25 11:22:18 crc kubenswrapper[4725]: I0225 11:22:18.647255 4725 scope.go:117] "RemoveContainer" containerID="5654d16a0de2872aaf5c7f754bc590b5077a16b7fbfd4f83af50aabaad603412" Feb 25 11:22:22 crc kubenswrapper[4725]: I0225 11:22:22.225172 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:22:22 crc kubenswrapper[4725]: E0225 11:22:22.226136 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:22:37 crc kubenswrapper[4725]: I0225 11:22:37.224552 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:22:37 crc kubenswrapper[4725]: E0225 11:22:37.225255 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:22:41 crc kubenswrapper[4725]: I0225 11:22:41.067953 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7mfzn"] Feb 25 11:22:41 crc kubenswrapper[4725]: I0225 11:22:41.081269 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7mfzn"] Feb 25 11:22:41 crc kubenswrapper[4725]: I0225 11:22:41.243610 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a6a21f-d099-43a7-96f6-51c056d4568c" path="/var/lib/kubelet/pods/23a6a21f-d099-43a7-96f6-51c056d4568c/volumes" Feb 25 11:22:42 crc kubenswrapper[4725]: I0225 11:22:42.948552 4725 generic.go:334] "Generic (PLEG): container finished" podID="5bbf0497-1315-4613-b6ff-c826f5cf2a75" containerID="229586d3fa3e1dca2ebdd18bac3ae81a938c8dd7c485f63fbcfb642a07f81fb0" exitCode=0 Feb 25 11:22:42 crc kubenswrapper[4725]: I0225 11:22:42.948631 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" event={"ID":"5bbf0497-1315-4613-b6ff-c826f5cf2a75","Type":"ContainerDied","Data":"229586d3fa3e1dca2ebdd18bac3ae81a938c8dd7c485f63fbcfb642a07f81fb0"} Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.435710 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.567008 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqth\" (UniqueName: \"kubernetes.io/projected/5bbf0497-1315-4613-b6ff-c826f5cf2a75-kube-api-access-dnqth\") pod \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.567630 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-inventory\") pod \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.567689 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-ssh-key-openstack-edpm-ipam\") pod \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\" (UID: \"5bbf0497-1315-4613-b6ff-c826f5cf2a75\") " Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.575150 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bbf0497-1315-4613-b6ff-c826f5cf2a75-kube-api-access-dnqth" (OuterVolumeSpecName: "kube-api-access-dnqth") pod "5bbf0497-1315-4613-b6ff-c826f5cf2a75" (UID: "5bbf0497-1315-4613-b6ff-c826f5cf2a75"). InnerVolumeSpecName "kube-api-access-dnqth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.606776 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5bbf0497-1315-4613-b6ff-c826f5cf2a75" (UID: "5bbf0497-1315-4613-b6ff-c826f5cf2a75"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.616362 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-inventory" (OuterVolumeSpecName: "inventory") pod "5bbf0497-1315-4613-b6ff-c826f5cf2a75" (UID: "5bbf0497-1315-4613-b6ff-c826f5cf2a75"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.671062 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqth\" (UniqueName: \"kubernetes.io/projected/5bbf0497-1315-4613-b6ff-c826f5cf2a75-kube-api-access-dnqth\") on node \"crc\" DevicePath \"\"" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.671112 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.671133 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bbf0497-1315-4613-b6ff-c826f5cf2a75-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.982012 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" 
event={"ID":"5bbf0497-1315-4613-b6ff-c826f5cf2a75","Type":"ContainerDied","Data":"23c607633877bd8b4aebad6e29fb7431821282d24b4fcaeb1ad5334be6020c21"} Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.982057 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23c607633877bd8b4aebad6e29fb7431821282d24b4fcaeb1ad5334be6020c21" Feb 25 11:22:44 crc kubenswrapper[4725]: I0225 11:22:44.982121 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.075174 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb"] Feb 25 11:22:45 crc kubenswrapper[4725]: E0225 11:22:45.075654 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bbf0497-1315-4613-b6ff-c826f5cf2a75" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.075680 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bbf0497-1315-4613-b6ff-c826f5cf2a75" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 11:22:45 crc kubenswrapper[4725]: E0225 11:22:45.075695 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685d89a8-03f3-40ab-849f-27f44039ebe9" containerName="oc" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.075704 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="685d89a8-03f3-40ab-849f-27f44039ebe9" containerName="oc" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.075940 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="685d89a8-03f3-40ab-849f-27f44039ebe9" containerName="oc" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.075968 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bbf0497-1315-4613-b6ff-c826f5cf2a75" 
containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.076716 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.079219 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.083490 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb"] Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.083783 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.084007 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.085656 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.182178 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.182536 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.182718 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw87g\" (UniqueName: \"kubernetes.io/projected/d3ef192a-3ad7-445f-b029-580b9e395372-kube-api-access-sw87g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.284906 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.285459 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.285574 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw87g\" (UniqueName: \"kubernetes.io/projected/d3ef192a-3ad7-445f-b029-580b9e395372-kube-api-access-sw87g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.291887 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.292082 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.304321 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw87g\" (UniqueName: \"kubernetes.io/projected/d3ef192a-3ad7-445f-b029-580b9e395372-kube-api-access-sw87g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:45 crc kubenswrapper[4725]: I0225 11:22:45.395366 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:22:46 crc kubenswrapper[4725]: I0225 11:22:46.017262 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb"] Feb 25 11:22:46 crc kubenswrapper[4725]: I0225 11:22:46.029416 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:22:47 crc kubenswrapper[4725]: I0225 11:22:47.005947 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" event={"ID":"d3ef192a-3ad7-445f-b029-580b9e395372","Type":"ContainerStarted","Data":"7cba8524ea71e0a77203d0937f0db14589a8155323f026b597782bcc3fe8df06"} Feb 25 11:22:48 crc kubenswrapper[4725]: I0225 11:22:48.022255 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" event={"ID":"d3ef192a-3ad7-445f-b029-580b9e395372","Type":"ContainerStarted","Data":"65566e146c13da89e257389fef1e9ddd5e8eeba94b823f6791e763320e6c5bb7"} Feb 25 11:22:48 crc kubenswrapper[4725]: I0225 11:22:48.039818 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-djg6t"] Feb 25 11:22:48 crc kubenswrapper[4725]: I0225 11:22:48.051923 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-djg6t"] Feb 25 11:22:48 crc kubenswrapper[4725]: I0225 11:22:48.058260 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" podStartSLOduration=2.309421494 podStartE2EDuration="3.058242081s" podCreationTimestamp="2026-02-25 11:22:45 +0000 UTC" firstStartedPulling="2026-02-25 11:22:46.029026801 +0000 UTC m=+1791.527608856" lastFinishedPulling="2026-02-25 11:22:46.777847418 +0000 UTC m=+1792.276429443" observedRunningTime="2026-02-25 
11:22:48.051171771 +0000 UTC m=+1793.549753806" watchObservedRunningTime="2026-02-25 11:22:48.058242081 +0000 UTC m=+1793.556824106" Feb 25 11:22:49 crc kubenswrapper[4725]: I0225 11:22:49.224498 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:22:49 crc kubenswrapper[4725]: E0225 11:22:49.225245 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:22:49 crc kubenswrapper[4725]: I0225 11:22:49.240430 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76768b73-31d1-407a-90e7-9583d2b3a773" path="/var/lib/kubelet/pods/76768b73-31d1-407a-90e7-9583d2b3a773/volumes" Feb 25 11:22:52 crc kubenswrapper[4725]: I0225 11:22:52.041963 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-skknf"] Feb 25 11:22:52 crc kubenswrapper[4725]: I0225 11:22:52.058209 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fzl9q"] Feb 25 11:22:52 crc kubenswrapper[4725]: I0225 11:22:52.070285 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-skknf"] Feb 25 11:22:52 crc kubenswrapper[4725]: I0225 11:22:52.081527 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fzl9q"] Feb 25 11:22:53 crc kubenswrapper[4725]: I0225 11:22:53.239682 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc96366a-6045-408e-9be6-07abc53c1b3e" path="/var/lib/kubelet/pods/cc96366a-6045-408e-9be6-07abc53c1b3e/volumes" Feb 25 11:22:53 crc kubenswrapper[4725]: I0225 
11:22:53.241069 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf601308-e467-48ee-998c-7a2ecf04d92c" path="/var/lib/kubelet/pods/cf601308-e467-48ee-998c-7a2ecf04d92c/volumes" Feb 25 11:23:01 crc kubenswrapper[4725]: I0225 11:23:01.225425 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:23:01 crc kubenswrapper[4725]: E0225 11:23:01.226578 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:23:06 crc kubenswrapper[4725]: I0225 11:23:06.044509 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7mk8j"] Feb 25 11:23:06 crc kubenswrapper[4725]: I0225 11:23:06.052663 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7mk8j"] Feb 25 11:23:07 crc kubenswrapper[4725]: I0225 11:23:07.245545 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe5daf6-23bb-4480-8bd7-724dbb47ad3d" path="/var/lib/kubelet/pods/afe5daf6-23bb-4480-8bd7-724dbb47ad3d/volumes" Feb 25 11:23:14 crc kubenswrapper[4725]: I0225 11:23:14.226244 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:23:14 crc kubenswrapper[4725]: E0225 11:23:14.227164 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:23:18 crc kubenswrapper[4725]: I0225 11:23:18.952604 4725 scope.go:117] "RemoveContainer" containerID="3ed375a5c0694529b49eecaf54ba821794fc615f8cc55fe46ef417bc85fd8d5b" Feb 25 11:23:19 crc kubenswrapper[4725]: I0225 11:23:19.012906 4725 scope.go:117] "RemoveContainer" containerID="ce68badea7af2996cd0c81c8799313c0b2dc12722be1709a23bbb44d9a9f890c" Feb 25 11:23:19 crc kubenswrapper[4725]: I0225 11:23:19.082409 4725 scope.go:117] "RemoveContainer" containerID="005c1a8807f48f2b85fbba453f6c6664a70ca409b6c51dfdab7deae6234c2706" Feb 25 11:23:19 crc kubenswrapper[4725]: I0225 11:23:19.143113 4725 scope.go:117] "RemoveContainer" containerID="fb2093968cb49803dda5fbd5f5111472618358218c96b8de28be014644661098" Feb 25 11:23:19 crc kubenswrapper[4725]: I0225 11:23:19.182370 4725 scope.go:117] "RemoveContainer" containerID="4221c2a6df90a5d299315a43805dab436dd16b2d9f7c256685563fd04185ddc6" Feb 25 11:23:25 crc kubenswrapper[4725]: I0225 11:23:25.462186 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:23:25 crc kubenswrapper[4725]: E0225 11:23:25.462880 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:23:40 crc kubenswrapper[4725]: I0225 11:23:40.224568 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:23:40 crc kubenswrapper[4725]: E0225 11:23:40.225962 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:23:47 crc kubenswrapper[4725]: I0225 11:23:47.037102 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-87xvm"] Feb 25 11:23:47 crc kubenswrapper[4725]: I0225 11:23:47.046149 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-87xvm"] Feb 25 11:23:47 crc kubenswrapper[4725]: I0225 11:23:47.250914 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97586ed7-2c87-4ebc-946e-56e4fab86e31" path="/var/lib/kubelet/pods/97586ed7-2c87-4ebc-946e-56e4fab86e31/volumes" Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 11:23:48.054119 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-725d-account-create-update-kbmpr"] Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 11:23:48.065917 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ce2a-account-create-update-4ls5m"] Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 11:23:48.074944 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-48b7w"] Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 11:23:48.083436 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fmmml"] Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 11:23:48.090429 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-725d-account-create-update-kbmpr"] Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 11:23:48.097281 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fmmml"] Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 
11:23:48.104393 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ce2a-account-create-update-4ls5m"] Feb 25 11:23:48 crc kubenswrapper[4725]: I0225 11:23:48.112079 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-48b7w"] Feb 25 11:23:49 crc kubenswrapper[4725]: I0225 11:23:49.043937 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5c48-account-create-update-pd8sb"] Feb 25 11:23:49 crc kubenswrapper[4725]: I0225 11:23:49.060796 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5c48-account-create-update-pd8sb"] Feb 25 11:23:49 crc kubenswrapper[4725]: I0225 11:23:49.247183 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15c22dd9-63a4-44f0-a275-bd8d6415fff1" path="/var/lib/kubelet/pods/15c22dd9-63a4-44f0-a275-bd8d6415fff1/volumes" Feb 25 11:23:49 crc kubenswrapper[4725]: I0225 11:23:49.248454 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="967eb016-3ed0-4d88-a839-e753c7a6e9a5" path="/var/lib/kubelet/pods/967eb016-3ed0-4d88-a839-e753c7a6e9a5/volumes" Feb 25 11:23:49 crc kubenswrapper[4725]: I0225 11:23:49.249750 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac81c472-c14e-4190-a40d-ed4a19e13dd7" path="/var/lib/kubelet/pods/ac81c472-c14e-4190-a40d-ed4a19e13dd7/volumes" Feb 25 11:23:49 crc kubenswrapper[4725]: I0225 11:23:49.251024 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b049d6-afa5-49eb-8bef-64de2f0672b5" path="/var/lib/kubelet/pods/e8b049d6-afa5-49eb-8bef-64de2f0672b5/volumes" Feb 25 11:23:49 crc kubenswrapper[4725]: I0225 11:23:49.252962 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed75b89a-43a5-4557-b8e2-a8f730bf8e74" path="/var/lib/kubelet/pods/ed75b89a-43a5-4557-b8e2-a8f730bf8e74/volumes" Feb 25 11:23:52 crc kubenswrapper[4725]: I0225 11:23:52.224806 4725 scope.go:117] 
"RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:23:52 crc kubenswrapper[4725]: E0225 11:23:52.225651 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:23:55 crc kubenswrapper[4725]: I0225 11:23:55.726988 4725 generic.go:334] "Generic (PLEG): container finished" podID="d3ef192a-3ad7-445f-b029-580b9e395372" containerID="65566e146c13da89e257389fef1e9ddd5e8eeba94b823f6791e763320e6c5bb7" exitCode=0 Feb 25 11:23:55 crc kubenswrapper[4725]: I0225 11:23:55.727130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" event={"ID":"d3ef192a-3ad7-445f-b029-580b9e395372","Type":"ContainerDied","Data":"65566e146c13da89e257389fef1e9ddd5e8eeba94b823f6791e763320e6c5bb7"} Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.156306 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.203328 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-inventory\") pod \"d3ef192a-3ad7-445f-b029-580b9e395372\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.203427 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw87g\" (UniqueName: \"kubernetes.io/projected/d3ef192a-3ad7-445f-b029-580b9e395372-kube-api-access-sw87g\") pod \"d3ef192a-3ad7-445f-b029-580b9e395372\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.203525 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-ssh-key-openstack-edpm-ipam\") pod \"d3ef192a-3ad7-445f-b029-580b9e395372\" (UID: \"d3ef192a-3ad7-445f-b029-580b9e395372\") " Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.210130 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ef192a-3ad7-445f-b029-580b9e395372-kube-api-access-sw87g" (OuterVolumeSpecName: "kube-api-access-sw87g") pod "d3ef192a-3ad7-445f-b029-580b9e395372" (UID: "d3ef192a-3ad7-445f-b029-580b9e395372"). InnerVolumeSpecName "kube-api-access-sw87g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.242287 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-inventory" (OuterVolumeSpecName: "inventory") pod "d3ef192a-3ad7-445f-b029-580b9e395372" (UID: "d3ef192a-3ad7-445f-b029-580b9e395372"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.249593 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d3ef192a-3ad7-445f-b029-580b9e395372" (UID: "d3ef192a-3ad7-445f-b029-580b9e395372"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.305784 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.305825 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ef192a-3ad7-445f-b029-580b9e395372-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.305895 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw87g\" (UniqueName: \"kubernetes.io/projected/d3ef192a-3ad7-445f-b029-580b9e395372-kube-api-access-sw87g\") on node \"crc\" DevicePath \"\"" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.758256 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" event={"ID":"d3ef192a-3ad7-445f-b029-580b9e395372","Type":"ContainerDied","Data":"7cba8524ea71e0a77203d0937f0db14589a8155323f026b597782bcc3fe8df06"} Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.758311 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cba8524ea71e0a77203d0937f0db14589a8155323f026b597782bcc3fe8df06" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 
11:23:57.758430 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.859066 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh"] Feb 25 11:23:57 crc kubenswrapper[4725]: E0225 11:23:57.859519 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ef192a-3ad7-445f-b029-580b9e395372" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.859554 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ef192a-3ad7-445f-b029-580b9e395372" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.859785 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ef192a-3ad7-445f-b029-580b9e395372" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.860533 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.862917 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.863192 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.863358 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.869182 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.879912 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh"] Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.921141 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.921440 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nls7t\" (UniqueName: \"kubernetes.io/projected/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-kube-api-access-nls7t\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 
11:23:57 crc kubenswrapper[4725]: I0225 11:23:57.921670 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.024034 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nls7t\" (UniqueName: \"kubernetes.io/projected/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-kube-api-access-nls7t\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.024338 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.024413 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.027945 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.028817 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.063517 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nls7t\" (UniqueName: \"kubernetes.io/projected/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-kube-api-access-nls7t\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p46vh\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.196490 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:23:58 crc kubenswrapper[4725]: I0225 11:23:58.799494 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh"] Feb 25 11:23:59 crc kubenswrapper[4725]: I0225 11:23:59.795343 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" event={"ID":"6d50a11f-90c8-490f-90a3-9fb2c14f2bea","Type":"ContainerStarted","Data":"9b53e9b25aea7c66210b0a6d5e89372a90567708c73b97c86489eea0e96c6bb8"} Feb 25 11:23:59 crc kubenswrapper[4725]: I0225 11:23:59.795649 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" event={"ID":"6d50a11f-90c8-490f-90a3-9fb2c14f2bea","Type":"ContainerStarted","Data":"815b29787e018145bea810aa9934065fc0aa3a3379567a1d5fe4810176b2b240"} Feb 25 11:23:59 crc kubenswrapper[4725]: I0225 11:23:59.823173 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" podStartSLOduration=2.127025102 podStartE2EDuration="2.823153612s" podCreationTimestamp="2026-02-25 11:23:57 +0000 UTC" firstStartedPulling="2026-02-25 11:23:58.792712911 +0000 UTC m=+1864.291294946" lastFinishedPulling="2026-02-25 11:23:59.488841391 +0000 UTC m=+1864.987423456" observedRunningTime="2026-02-25 11:23:59.818146077 +0000 UTC m=+1865.316728112" watchObservedRunningTime="2026-02-25 11:23:59.823153612 +0000 UTC m=+1865.321735637" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.138509 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533644-2jhmj"] Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.139680 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533644-2jhmj" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.142442 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.142896 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.144304 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.159572 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533644-2jhmj"] Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.171086 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2dss\" (UniqueName: \"kubernetes.io/projected/07212026-7350-41cd-8012-c427bd678f3c-kube-api-access-w2dss\") pod \"auto-csr-approver-29533644-2jhmj\" (UID: \"07212026-7350-41cd-8012-c427bd678f3c\") " pod="openshift-infra/auto-csr-approver-29533644-2jhmj" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.272741 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dss\" (UniqueName: \"kubernetes.io/projected/07212026-7350-41cd-8012-c427bd678f3c-kube-api-access-w2dss\") pod \"auto-csr-approver-29533644-2jhmj\" (UID: \"07212026-7350-41cd-8012-c427bd678f3c\") " pod="openshift-infra/auto-csr-approver-29533644-2jhmj" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.293379 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dss\" (UniqueName: \"kubernetes.io/projected/07212026-7350-41cd-8012-c427bd678f3c-kube-api-access-w2dss\") pod \"auto-csr-approver-29533644-2jhmj\" (UID: \"07212026-7350-41cd-8012-c427bd678f3c\") " 
pod="openshift-infra/auto-csr-approver-29533644-2jhmj" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.472449 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533644-2jhmj" Feb 25 11:24:00 crc kubenswrapper[4725]: I0225 11:24:00.960402 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533644-2jhmj"] Feb 25 11:24:00 crc kubenswrapper[4725]: W0225 11:24:00.969080 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07212026_7350_41cd_8012_c427bd678f3c.slice/crio-ad839679d7775e23944b3c959f52e16ed088a73536eb8394a93a39e09e607a1b WatchSource:0}: Error finding container ad839679d7775e23944b3c959f52e16ed088a73536eb8394a93a39e09e607a1b: Status 404 returned error can't find the container with id ad839679d7775e23944b3c959f52e16ed088a73536eb8394a93a39e09e607a1b Feb 25 11:24:01 crc kubenswrapper[4725]: I0225 11:24:01.818956 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533644-2jhmj" event={"ID":"07212026-7350-41cd-8012-c427bd678f3c","Type":"ContainerStarted","Data":"ad839679d7775e23944b3c959f52e16ed088a73536eb8394a93a39e09e607a1b"} Feb 25 11:24:02 crc kubenswrapper[4725]: I0225 11:24:02.846082 4725 generic.go:334] "Generic (PLEG): container finished" podID="07212026-7350-41cd-8012-c427bd678f3c" containerID="32fe34383390b195eab6a2d1793b396e86bbfbb9b05b4b9c711cdc833de263d7" exitCode=0 Feb 25 11:24:02 crc kubenswrapper[4725]: I0225 11:24:02.846147 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533644-2jhmj" event={"ID":"07212026-7350-41cd-8012-c427bd678f3c","Type":"ContainerDied","Data":"32fe34383390b195eab6a2d1793b396e86bbfbb9b05b4b9c711cdc833de263d7"} Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.283811 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533644-2jhmj" Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.399507 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2dss\" (UniqueName: \"kubernetes.io/projected/07212026-7350-41cd-8012-c427bd678f3c-kube-api-access-w2dss\") pod \"07212026-7350-41cd-8012-c427bd678f3c\" (UID: \"07212026-7350-41cd-8012-c427bd678f3c\") " Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.406115 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07212026-7350-41cd-8012-c427bd678f3c-kube-api-access-w2dss" (OuterVolumeSpecName: "kube-api-access-w2dss") pod "07212026-7350-41cd-8012-c427bd678f3c" (UID: "07212026-7350-41cd-8012-c427bd678f3c"). InnerVolumeSpecName "kube-api-access-w2dss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.501276 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2dss\" (UniqueName: \"kubernetes.io/projected/07212026-7350-41cd-8012-c427bd678f3c-kube-api-access-w2dss\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.868570 4725 generic.go:334] "Generic (PLEG): container finished" podID="6d50a11f-90c8-490f-90a3-9fb2c14f2bea" containerID="9b53e9b25aea7c66210b0a6d5e89372a90567708c73b97c86489eea0e96c6bb8" exitCode=0 Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.868644 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" event={"ID":"6d50a11f-90c8-490f-90a3-9fb2c14f2bea","Type":"ContainerDied","Data":"9b53e9b25aea7c66210b0a6d5e89372a90567708c73b97c86489eea0e96c6bb8"} Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.871020 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533644-2jhmj" 
event={"ID":"07212026-7350-41cd-8012-c427bd678f3c","Type":"ContainerDied","Data":"ad839679d7775e23944b3c959f52e16ed088a73536eb8394a93a39e09e607a1b"} Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.871289 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad839679d7775e23944b3c959f52e16ed088a73536eb8394a93a39e09e607a1b" Feb 25 11:24:04 crc kubenswrapper[4725]: I0225 11:24:04.871130 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533644-2jhmj" Feb 25 11:24:05 crc kubenswrapper[4725]: I0225 11:24:05.370707 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533638-fgdv8"] Feb 25 11:24:05 crc kubenswrapper[4725]: I0225 11:24:05.380272 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533638-fgdv8"] Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.224462 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:24:06 crc kubenswrapper[4725]: E0225 11:24:06.225144 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.383216 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.439380 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-ssh-key-openstack-edpm-ipam\") pod \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.439870 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nls7t\" (UniqueName: \"kubernetes.io/projected/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-kube-api-access-nls7t\") pod \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.440062 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-inventory\") pod \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\" (UID: \"6d50a11f-90c8-490f-90a3-9fb2c14f2bea\") " Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.445446 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-kube-api-access-nls7t" (OuterVolumeSpecName: "kube-api-access-nls7t") pod "6d50a11f-90c8-490f-90a3-9fb2c14f2bea" (UID: "6d50a11f-90c8-490f-90a3-9fb2c14f2bea"). InnerVolumeSpecName "kube-api-access-nls7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.466960 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-inventory" (OuterVolumeSpecName: "inventory") pod "6d50a11f-90c8-490f-90a3-9fb2c14f2bea" (UID: "6d50a11f-90c8-490f-90a3-9fb2c14f2bea"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.475097 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6d50a11f-90c8-490f-90a3-9fb2c14f2bea" (UID: "6d50a11f-90c8-490f-90a3-9fb2c14f2bea"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.542560 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.542717 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.542798 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nls7t\" (UniqueName: \"kubernetes.io/projected/6d50a11f-90c8-490f-90a3-9fb2c14f2bea-kube-api-access-nls7t\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.892134 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" event={"ID":"6d50a11f-90c8-490f-90a3-9fb2c14f2bea","Type":"ContainerDied","Data":"815b29787e018145bea810aa9934065fc0aa3a3379567a1d5fe4810176b2b240"} Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.892213 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="815b29787e018145bea810aa9934065fc0aa3a3379567a1d5fe4810176b2b240" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 
11:24:06.892177 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p46vh" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.980086 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5"] Feb 25 11:24:06 crc kubenswrapper[4725]: E0225 11:24:06.980682 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d50a11f-90c8-490f-90a3-9fb2c14f2bea" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.980757 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d50a11f-90c8-490f-90a3-9fb2c14f2bea" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 11:24:06 crc kubenswrapper[4725]: E0225 11:24:06.980815 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07212026-7350-41cd-8012-c427bd678f3c" containerName="oc" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.980887 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07212026-7350-41cd-8012-c427bd678f3c" containerName="oc" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.981169 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07212026-7350-41cd-8012-c427bd678f3c" containerName="oc" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.981252 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d50a11f-90c8-490f-90a3-9fb2c14f2bea" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.982043 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.984499 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.984673 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.984802 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.987774 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:24:06 crc kubenswrapper[4725]: I0225 11:24:06.995044 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5"] Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.051667 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x86v\" (UniqueName: \"kubernetes.io/projected/b40ab19d-a233-4263-b29f-390b5069752d-kube-api-access-9x86v\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.051898 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 
11:24:07.052100 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.154249 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x86v\" (UniqueName: \"kubernetes.io/projected/b40ab19d-a233-4263-b29f-390b5069752d-kube-api-access-9x86v\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.154467 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.154606 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.159171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.161169 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.172380 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x86v\" (UniqueName: \"kubernetes.io/projected/b40ab19d-a233-4263-b29f-390b5069752d-kube-api-access-9x86v\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gznp5\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.237855 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a382086-c357-46af-83de-2b0e8cfeb4cc" path="/var/lib/kubelet/pods/4a382086-c357-46af-83de-2b0e8cfeb4cc/volumes" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.320296 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:07 crc kubenswrapper[4725]: I0225 11:24:07.975061 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5"] Feb 25 11:24:08 crc kubenswrapper[4725]: I0225 11:24:08.914648 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" event={"ID":"b40ab19d-a233-4263-b29f-390b5069752d","Type":"ContainerStarted","Data":"9f73e02d148f48ee0dfaf789ccf6e1b0fe7fe48cd2fbf77a55091c175bc11c9b"} Feb 25 11:24:08 crc kubenswrapper[4725]: I0225 11:24:08.914971 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" event={"ID":"b40ab19d-a233-4263-b29f-390b5069752d","Type":"ContainerStarted","Data":"abdba59ac19298581ddaf90b82cf0fffc240b3abbb75c322d3c85bb9383d50f8"} Feb 25 11:24:08 crc kubenswrapper[4725]: I0225 11:24:08.933279 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" podStartSLOduration=2.470850916 podStartE2EDuration="2.933253062s" podCreationTimestamp="2026-02-25 11:24:06 +0000 UTC" firstStartedPulling="2026-02-25 11:24:07.972784621 +0000 UTC m=+1873.471366646" lastFinishedPulling="2026-02-25 11:24:08.435186767 +0000 UTC m=+1873.933768792" observedRunningTime="2026-02-25 11:24:08.931610477 +0000 UTC m=+1874.430192512" watchObservedRunningTime="2026-02-25 11:24:08.933253062 +0000 UTC m=+1874.431835097" Feb 25 11:24:17 crc kubenswrapper[4725]: I0225 11:24:17.037068 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grc9g"] Feb 25 11:24:17 crc kubenswrapper[4725]: I0225 11:24:17.045021 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-grc9g"] Feb 25 11:24:17 crc kubenswrapper[4725]: I0225 
11:24:17.236145 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757bb635-edf2-4081-a9a1-fdc66588e0aa" path="/var/lib/kubelet/pods/757bb635-edf2-4081-a9a1-fdc66588e0aa/volumes" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.345594 4725 scope.go:117] "RemoveContainer" containerID="524a8eb709969188a06d2a46537fa1ce142d16e12eb0251ee5ee507047c7cfed" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.372328 4725 scope.go:117] "RemoveContainer" containerID="b15d2fb9893207719ff5fbd160c98448cd731de7d9e7eecd467fafaf7e6d64fa" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.477587 4725 scope.go:117] "RemoveContainer" containerID="4f46516fbde8921f0c96488e4649d1fa7b06f5f8cc85d74725a31289b3e9529e" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.526502 4725 scope.go:117] "RemoveContainer" containerID="cddf54955eecc79b95cde8526782c391fc3cd16d299108255b1c7ab73d2c671c" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.569824 4725 scope.go:117] "RemoveContainer" containerID="c9bf8f113504797681a6e783d046c1d61dae84c398e454734223e6fd8ce114b7" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.625533 4725 scope.go:117] "RemoveContainer" containerID="f3ca06d08bb87840fc4bb186e58f411e36aa95e04d0577939884383a3f0b4967" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.645484 4725 scope.go:117] "RemoveContainer" containerID="0893fd2efeac20b904757819bda89379b6915590d061a7499d042f960b8c8660" Feb 25 11:24:19 crc kubenswrapper[4725]: I0225 11:24:19.670722 4725 scope.go:117] "RemoveContainer" containerID="32b877b788a257c358345907585b181f5cf485ba98b9ae4004658a98e3b3c183" Feb 25 11:24:21 crc kubenswrapper[4725]: I0225 11:24:21.224882 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:24:21 crc kubenswrapper[4725]: E0225 11:24:21.225335 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:24:27 crc kubenswrapper[4725]: I0225 11:24:27.800135 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8zzct"] Feb 25 11:24:27 crc kubenswrapper[4725]: I0225 11:24:27.802812 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:27 crc kubenswrapper[4725]: I0225 11:24:27.820050 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8zzct"] Feb 25 11:24:27 crc kubenswrapper[4725]: I0225 11:24:27.960393 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-catalog-content\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:27 crc kubenswrapper[4725]: I0225 11:24:27.960478 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lg9m\" (UniqueName: \"kubernetes.io/projected/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-kube-api-access-5lg9m\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:27 crc kubenswrapper[4725]: I0225 11:24:27.960536 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-utilities\") pod \"certified-operators-8zzct\" (UID: 
\"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.062148 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-catalog-content\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.062209 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lg9m\" (UniqueName: \"kubernetes.io/projected/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-kube-api-access-5lg9m\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.062255 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-utilities\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.062718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-utilities\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.062999 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-catalog-content\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") 
" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.085797 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lg9m\" (UniqueName: \"kubernetes.io/projected/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-kube-api-access-5lg9m\") pod \"certified-operators-8zzct\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.150695 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:28 crc kubenswrapper[4725]: I0225 11:24:28.619620 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8zzct"] Feb 25 11:24:29 crc kubenswrapper[4725]: E0225 11:24:29.069698 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cc6bef_0193_45d7_b4d3_783ba3ef603a.slice/crio-7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad.scope\": RecentStats: unable to find data in memory cache]" Feb 25 11:24:29 crc kubenswrapper[4725]: I0225 11:24:29.176412 4725 generic.go:334] "Generic (PLEG): container finished" podID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerID="7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad" exitCode=0 Feb 25 11:24:29 crc kubenswrapper[4725]: I0225 11:24:29.176462 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zzct" event={"ID":"b4cc6bef-0193-45d7-b4d3-783ba3ef603a","Type":"ContainerDied","Data":"7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad"} Feb 25 11:24:29 crc kubenswrapper[4725]: I0225 11:24:29.176489 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zzct" 
event={"ID":"b4cc6bef-0193-45d7-b4d3-783ba3ef603a","Type":"ContainerStarted","Data":"15615971eb66dce45f6b2d95722e796f8a96986cfc2dbf98da2ebc28bce29f11"} Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.204395 4725 generic.go:334] "Generic (PLEG): container finished" podID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerID="3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6" exitCode=0 Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.204476 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zzct" event={"ID":"b4cc6bef-0193-45d7-b4d3-783ba3ef603a","Type":"ContainerDied","Data":"3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6"} Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.783749 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2lbpw"] Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.786438 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.802680 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lbpw"] Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.936792 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-catalog-content\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.936856 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm9kn\" (UniqueName: \"kubernetes.io/projected/d3eb1067-5959-4677-970f-dccab81b334f-kube-api-access-qm9kn\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:31 crc kubenswrapper[4725]: I0225 11:24:31.936957 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-utilities\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.038045 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-catalog-content\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.038111 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qm9kn\" (UniqueName: \"kubernetes.io/projected/d3eb1067-5959-4677-970f-dccab81b334f-kube-api-access-qm9kn\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.038229 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-utilities\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.038508 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-catalog-content\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.038627 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-utilities\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.062237 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm9kn\" (UniqueName: \"kubernetes.io/projected/d3eb1067-5959-4677-970f-dccab81b334f-kube-api-access-qm9kn\") pod \"redhat-operators-2lbpw\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.105734 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.225915 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zzct" event={"ID":"b4cc6bef-0193-45d7-b4d3-783ba3ef603a","Type":"ContainerStarted","Data":"a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c"} Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.251053 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8zzct" podStartSLOduration=2.817714383 podStartE2EDuration="5.251036816s" podCreationTimestamp="2026-02-25 11:24:27 +0000 UTC" firstStartedPulling="2026-02-25 11:24:29.178625556 +0000 UTC m=+1894.677207601" lastFinishedPulling="2026-02-25 11:24:31.611947989 +0000 UTC m=+1897.110530034" observedRunningTime="2026-02-25 11:24:32.245961083 +0000 UTC m=+1897.744543118" watchObservedRunningTime="2026-02-25 11:24:32.251036816 +0000 UTC m=+1897.749618841" Feb 25 11:24:32 crc kubenswrapper[4725]: W0225 11:24:32.568440 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3eb1067_5959_4677_970f_dccab81b334f.slice/crio-c730e0338e107616d7cf42773653907cb86bc43fb329155f2a4eb3ef9542d59b WatchSource:0}: Error finding container c730e0338e107616d7cf42773653907cb86bc43fb329155f2a4eb3ef9542d59b: Status 404 returned error can't find the container with id c730e0338e107616d7cf42773653907cb86bc43fb329155f2a4eb3ef9542d59b Feb 25 11:24:32 crc kubenswrapper[4725]: I0225 11:24:32.575457 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lbpw"] Feb 25 11:24:33 crc kubenswrapper[4725]: I0225 11:24:33.248609 4725 generic.go:334] "Generic (PLEG): container finished" podID="d3eb1067-5959-4677-970f-dccab81b334f" containerID="65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610" exitCode=0 
Feb 25 11:24:33 crc kubenswrapper[4725]: I0225 11:24:33.249761 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbpw" event={"ID":"d3eb1067-5959-4677-970f-dccab81b334f","Type":"ContainerDied","Data":"65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610"} Feb 25 11:24:33 crc kubenswrapper[4725]: I0225 11:24:33.249811 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbpw" event={"ID":"d3eb1067-5959-4677-970f-dccab81b334f","Type":"ContainerStarted","Data":"c730e0338e107616d7cf42773653907cb86bc43fb329155f2a4eb3ef9542d59b"} Feb 25 11:24:34 crc kubenswrapper[4725]: I0225 11:24:34.224195 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:24:34 crc kubenswrapper[4725]: E0225 11:24:34.225191 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:24:35 crc kubenswrapper[4725]: I0225 11:24:35.051891 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7xdft"] Feb 25 11:24:35 crc kubenswrapper[4725]: I0225 11:24:35.070706 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7xdft"] Feb 25 11:24:35 crc kubenswrapper[4725]: I0225 11:24:35.239771 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6ec572-732a-4118-bbd3-88295c5173da" path="/var/lib/kubelet/pods/1d6ec572-732a-4118-bbd3-88295c5173da/volumes" Feb 25 11:24:35 crc kubenswrapper[4725]: I0225 11:24:35.267536 4725 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-2lbpw" event={"ID":"d3eb1067-5959-4677-970f-dccab81b334f","Type":"ContainerStarted","Data":"ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76"} Feb 25 11:24:36 crc kubenswrapper[4725]: I0225 11:24:36.048728 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zphmq"] Feb 25 11:24:36 crc kubenswrapper[4725]: I0225 11:24:36.064386 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zphmq"] Feb 25 11:24:37 crc kubenswrapper[4725]: I0225 11:24:37.237124 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9253f776-9f91-4908-95a0-1f495326291d" path="/var/lib/kubelet/pods/9253f776-9f91-4908-95a0-1f495326291d/volumes" Feb 25 11:24:37 crc kubenswrapper[4725]: I0225 11:24:37.297469 4725 generic.go:334] "Generic (PLEG): container finished" podID="d3eb1067-5959-4677-970f-dccab81b334f" containerID="ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76" exitCode=0 Feb 25 11:24:37 crc kubenswrapper[4725]: I0225 11:24:37.297534 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbpw" event={"ID":"d3eb1067-5959-4677-970f-dccab81b334f","Type":"ContainerDied","Data":"ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76"} Feb 25 11:24:38 crc kubenswrapper[4725]: I0225 11:24:38.151643 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:38 crc kubenswrapper[4725]: I0225 11:24:38.151997 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:38 crc kubenswrapper[4725]: I0225 11:24:38.206042 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:38 crc kubenswrapper[4725]: I0225 
11:24:38.348727 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:39 crc kubenswrapper[4725]: I0225 11:24:39.315274 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbpw" event={"ID":"d3eb1067-5959-4677-970f-dccab81b334f","Type":"ContainerStarted","Data":"9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b"} Feb 25 11:24:39 crc kubenswrapper[4725]: I0225 11:24:39.348031 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2lbpw" podStartSLOduration=2.716304702 podStartE2EDuration="8.348006497s" podCreationTimestamp="2026-02-25 11:24:31 +0000 UTC" firstStartedPulling="2026-02-25 11:24:33.255884533 +0000 UTC m=+1898.754466558" lastFinishedPulling="2026-02-25 11:24:38.887586318 +0000 UTC m=+1904.386168353" observedRunningTime="2026-02-25 11:24:39.340743576 +0000 UTC m=+1904.839325621" watchObservedRunningTime="2026-02-25 11:24:39.348006497 +0000 UTC m=+1904.846588522" Feb 25 11:24:40 crc kubenswrapper[4725]: I0225 11:24:40.570559 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8zzct"] Feb 25 11:24:40 crc kubenswrapper[4725]: I0225 11:24:40.571089 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8zzct" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="registry-server" containerID="cri-o://a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c" gracePeriod=2 Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.064776 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.156999 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-utilities\") pod \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.157055 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-catalog-content\") pod \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.157193 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lg9m\" (UniqueName: \"kubernetes.io/projected/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-kube-api-access-5lg9m\") pod \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\" (UID: \"b4cc6bef-0193-45d7-b4d3-783ba3ef603a\") " Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.158201 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-utilities" (OuterVolumeSpecName: "utilities") pod "b4cc6bef-0193-45d7-b4d3-783ba3ef603a" (UID: "b4cc6bef-0193-45d7-b4d3-783ba3ef603a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.163793 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-kube-api-access-5lg9m" (OuterVolumeSpecName: "kube-api-access-5lg9m") pod "b4cc6bef-0193-45d7-b4d3-783ba3ef603a" (UID: "b4cc6bef-0193-45d7-b4d3-783ba3ef603a"). InnerVolumeSpecName "kube-api-access-5lg9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.233514 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4cc6bef-0193-45d7-b4d3-783ba3ef603a" (UID: "b4cc6bef-0193-45d7-b4d3-783ba3ef603a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.259123 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.259168 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.259182 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lg9m\" (UniqueName: \"kubernetes.io/projected/b4cc6bef-0193-45d7-b4d3-783ba3ef603a-kube-api-access-5lg9m\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.336315 4725 generic.go:334] "Generic (PLEG): container finished" podID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerID="a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c" exitCode=0 Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.336387 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zzct" event={"ID":"b4cc6bef-0193-45d7-b4d3-783ba3ef603a","Type":"ContainerDied","Data":"a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c"} Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.336424 4725 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8zzct" event={"ID":"b4cc6bef-0193-45d7-b4d3-783ba3ef603a","Type":"ContainerDied","Data":"15615971eb66dce45f6b2d95722e796f8a96986cfc2dbf98da2ebc28bce29f11"} Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.336448 4725 scope.go:117] "RemoveContainer" containerID="a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.336643 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zzct" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.367566 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8zzct"] Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.371447 4725 scope.go:117] "RemoveContainer" containerID="3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.376895 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8zzct"] Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.395033 4725 scope.go:117] "RemoveContainer" containerID="7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.452969 4725 scope.go:117] "RemoveContainer" containerID="a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c" Feb 25 11:24:41 crc kubenswrapper[4725]: E0225 11:24:41.453445 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c\": container with ID starting with a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c not found: ID does not exist" containerID="a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 
11:24:41.453481 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c"} err="failed to get container status \"a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c\": rpc error: code = NotFound desc = could not find container \"a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c\": container with ID starting with a643d0f54ce4ff0bfb1ce911fe07ad38c3c1c8783ec80f3fdf3f092f7d94a73c not found: ID does not exist" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.453506 4725 scope.go:117] "RemoveContainer" containerID="3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6" Feb 25 11:24:41 crc kubenswrapper[4725]: E0225 11:24:41.453995 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6\": container with ID starting with 3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6 not found: ID does not exist" containerID="3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.454037 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6"} err="failed to get container status \"3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6\": rpc error: code = NotFound desc = could not find container \"3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6\": container with ID starting with 3a719b85d1dc372f69bebda5f5ee0253bb1b0c12e51420e71999fb6113ba55a6 not found: ID does not exist" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.454063 4725 scope.go:117] "RemoveContainer" containerID="7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad" Feb 25 11:24:41 crc 
kubenswrapper[4725]: E0225 11:24:41.454466 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad\": container with ID starting with 7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad not found: ID does not exist" containerID="7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad" Feb 25 11:24:41 crc kubenswrapper[4725]: I0225 11:24:41.454510 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad"} err="failed to get container status \"7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad\": rpc error: code = NotFound desc = could not find container \"7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad\": container with ID starting with 7c6e21393e7bfc288335376143812a6d0652d5be3646b382939971a5ac9656ad not found: ID does not exist" Feb 25 11:24:42 crc kubenswrapper[4725]: I0225 11:24:42.106495 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:42 crc kubenswrapper[4725]: I0225 11:24:42.106903 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:43 crc kubenswrapper[4725]: I0225 11:24:43.157767 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2lbpw" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="registry-server" probeResult="failure" output=< Feb 25 11:24:43 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 25 11:24:43 crc kubenswrapper[4725]: > Feb 25 11:24:43 crc kubenswrapper[4725]: I0225 11:24:43.238279 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" path="/var/lib/kubelet/pods/b4cc6bef-0193-45d7-b4d3-783ba3ef603a/volumes" Feb 25 11:24:46 crc kubenswrapper[4725]: I0225 11:24:46.224298 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:24:46 crc kubenswrapper[4725]: E0225 11:24:46.224788 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:24:47 crc kubenswrapper[4725]: I0225 11:24:47.393740 4725 generic.go:334] "Generic (PLEG): container finished" podID="b40ab19d-a233-4263-b29f-390b5069752d" containerID="9f73e02d148f48ee0dfaf789ccf6e1b0fe7fe48cd2fbf77a55091c175bc11c9b" exitCode=0 Feb 25 11:24:47 crc kubenswrapper[4725]: I0225 11:24:47.393808 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" event={"ID":"b40ab19d-a233-4263-b29f-390b5069752d","Type":"ContainerDied","Data":"9f73e02d148f48ee0dfaf789ccf6e1b0fe7fe48cd2fbf77a55091c175bc11c9b"} Feb 25 11:24:48 crc kubenswrapper[4725]: I0225 11:24:48.832457 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.014724 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x86v\" (UniqueName: \"kubernetes.io/projected/b40ab19d-a233-4263-b29f-390b5069752d-kube-api-access-9x86v\") pod \"b40ab19d-a233-4263-b29f-390b5069752d\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.015313 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-ssh-key-openstack-edpm-ipam\") pod \"b40ab19d-a233-4263-b29f-390b5069752d\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.015358 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-inventory\") pod \"b40ab19d-a233-4263-b29f-390b5069752d\" (UID: \"b40ab19d-a233-4263-b29f-390b5069752d\") " Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.023343 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40ab19d-a233-4263-b29f-390b5069752d-kube-api-access-9x86v" (OuterVolumeSpecName: "kube-api-access-9x86v") pod "b40ab19d-a233-4263-b29f-390b5069752d" (UID: "b40ab19d-a233-4263-b29f-390b5069752d"). InnerVolumeSpecName "kube-api-access-9x86v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.065570 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b40ab19d-a233-4263-b29f-390b5069752d" (UID: "b40ab19d-a233-4263-b29f-390b5069752d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.066696 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-inventory" (OuterVolumeSpecName: "inventory") pod "b40ab19d-a233-4263-b29f-390b5069752d" (UID: "b40ab19d-a233-4263-b29f-390b5069752d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.118509 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.118569 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b40ab19d-a233-4263-b29f-390b5069752d-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.118596 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x86v\" (UniqueName: \"kubernetes.io/projected/b40ab19d-a233-4263-b29f-390b5069752d-kube-api-access-9x86v\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.415152 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" 
event={"ID":"b40ab19d-a233-4263-b29f-390b5069752d","Type":"ContainerDied","Data":"abdba59ac19298581ddaf90b82cf0fffc240b3abbb75c322d3c85bb9383d50f8"} Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.415190 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abdba59ac19298581ddaf90b82cf0fffc240b3abbb75c322d3c85bb9383d50f8" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.415258 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gznp5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.539983 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5"] Feb 25 11:24:49 crc kubenswrapper[4725]: E0225 11:24:49.540463 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="extract-content" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.540491 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="extract-content" Feb 25 11:24:49 crc kubenswrapper[4725]: E0225 11:24:49.540513 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="registry-server" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.540524 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="registry-server" Feb 25 11:24:49 crc kubenswrapper[4725]: E0225 11:24:49.540550 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40ab19d-a233-4263-b29f-390b5069752d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.540562 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40ab19d-a233-4263-b29f-390b5069752d" 
containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 11:24:49 crc kubenswrapper[4725]: E0225 11:24:49.540586 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="extract-utilities" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.540597 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="extract-utilities" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.540815 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cc6bef-0193-45d7-b4d3-783ba3ef603a" containerName="registry-server" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.540875 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40ab19d-a233-4263-b29f-390b5069752d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.541681 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.551431 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5"] Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.551462 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.551565 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.551612 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.552078 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.730863 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.730930 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.731062 
4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4clkw\" (UniqueName: \"kubernetes.io/projected/3118e370-4c72-4fc4-bf2b-d27645473666-kube-api-access-4clkw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.832662 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4clkw\" (UniqueName: \"kubernetes.io/projected/3118e370-4c72-4fc4-bf2b-d27645473666-kube-api-access-4clkw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.833154 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.833197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.839438 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.839471 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.859408 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4clkw\" (UniqueName: \"kubernetes.io/projected/3118e370-4c72-4fc4-bf2b-d27645473666-kube-api-access-4clkw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:49 crc kubenswrapper[4725]: I0225 11:24:49.866010 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:24:50 crc kubenswrapper[4725]: I0225 11:24:50.438998 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5"] Feb 25 11:24:51 crc kubenswrapper[4725]: I0225 11:24:51.467866 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" event={"ID":"3118e370-4c72-4fc4-bf2b-d27645473666","Type":"ContainerStarted","Data":"4a6894093b39a523f28952ca4e20ba0ee3f455d50bc17b16198f75de198ea21b"} Feb 25 11:24:51 crc kubenswrapper[4725]: I0225 11:24:51.467930 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" event={"ID":"3118e370-4c72-4fc4-bf2b-d27645473666","Type":"ContainerStarted","Data":"0602e7b6b0637bcaceb93c5b2f4145a4b26d83999452ee5cc720d85c288eff34"} Feb 25 11:24:51 crc kubenswrapper[4725]: I0225 11:24:51.494158 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" podStartSLOduration=1.8482880000000002 podStartE2EDuration="2.494141305s" podCreationTimestamp="2026-02-25 11:24:49 +0000 UTC" firstStartedPulling="2026-02-25 11:24:50.451963207 +0000 UTC m=+1915.950545262" lastFinishedPulling="2026-02-25 11:24:51.097816532 +0000 UTC m=+1916.596398567" observedRunningTime="2026-02-25 11:24:51.486705539 +0000 UTC m=+1916.985287604" watchObservedRunningTime="2026-02-25 11:24:51.494141305 +0000 UTC m=+1916.992723330" Feb 25 11:24:52 crc kubenswrapper[4725]: I0225 11:24:52.178535 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:52 crc kubenswrapper[4725]: I0225 11:24:52.274402 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:52 crc kubenswrapper[4725]: I0225 11:24:52.418259 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lbpw"] Feb 25 11:24:53 crc kubenswrapper[4725]: I0225 11:24:53.500333 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2lbpw" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="registry-server" containerID="cri-o://9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b" gracePeriod=2 Feb 25 11:24:53 crc kubenswrapper[4725]: I0225 11:24:53.993550 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.120358 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-catalog-content\") pod \"d3eb1067-5959-4677-970f-dccab81b334f\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.120664 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-utilities\") pod \"d3eb1067-5959-4677-970f-dccab81b334f\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.120946 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm9kn\" (UniqueName: \"kubernetes.io/projected/d3eb1067-5959-4677-970f-dccab81b334f-kube-api-access-qm9kn\") pod \"d3eb1067-5959-4677-970f-dccab81b334f\" (UID: \"d3eb1067-5959-4677-970f-dccab81b334f\") " Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.121396 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-utilities" (OuterVolumeSpecName: "utilities") pod "d3eb1067-5959-4677-970f-dccab81b334f" (UID: "d3eb1067-5959-4677-970f-dccab81b334f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.121682 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.126544 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3eb1067-5959-4677-970f-dccab81b334f-kube-api-access-qm9kn" (OuterVolumeSpecName: "kube-api-access-qm9kn") pod "d3eb1067-5959-4677-970f-dccab81b334f" (UID: "d3eb1067-5959-4677-970f-dccab81b334f"). InnerVolumeSpecName "kube-api-access-qm9kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.224109 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm9kn\" (UniqueName: \"kubernetes.io/projected/d3eb1067-5959-4677-970f-dccab81b334f-kube-api-access-qm9kn\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.257111 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3eb1067-5959-4677-970f-dccab81b334f" (UID: "d3eb1067-5959-4677-970f-dccab81b334f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.327134 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb1067-5959-4677-970f-dccab81b334f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.514106 4725 generic.go:334] "Generic (PLEG): container finished" podID="d3eb1067-5959-4677-970f-dccab81b334f" containerID="9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b" exitCode=0 Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.514151 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbpw" event={"ID":"d3eb1067-5959-4677-970f-dccab81b334f","Type":"ContainerDied","Data":"9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b"} Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.514179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lbpw" event={"ID":"d3eb1067-5959-4677-970f-dccab81b334f","Type":"ContainerDied","Data":"c730e0338e107616d7cf42773653907cb86bc43fb329155f2a4eb3ef9542d59b"} Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.514187 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2lbpw" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.514197 4725 scope.go:117] "RemoveContainer" containerID="9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.534204 4725 scope.go:117] "RemoveContainer" containerID="ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.568220 4725 scope.go:117] "RemoveContainer" containerID="65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.569993 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lbpw"] Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.581259 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2lbpw"] Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.607226 4725 scope.go:117] "RemoveContainer" containerID="9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b" Feb 25 11:24:54 crc kubenswrapper[4725]: E0225 11:24:54.607701 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b\": container with ID starting with 9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b not found: ID does not exist" containerID="9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.607742 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b"} err="failed to get container status \"9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b\": rpc error: code = NotFound desc = could not find container 
\"9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b\": container with ID starting with 9497b5cc0a73de1491f524987d3ba77fc2d5769a99d3cb59a005f6385d061d9b not found: ID does not exist" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.607768 4725 scope.go:117] "RemoveContainer" containerID="ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76" Feb 25 11:24:54 crc kubenswrapper[4725]: E0225 11:24:54.608118 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76\": container with ID starting with ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76 not found: ID does not exist" containerID="ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.608145 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76"} err="failed to get container status \"ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76\": rpc error: code = NotFound desc = could not find container \"ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76\": container with ID starting with ac50727b8e38706fa503b80b60245688baaf811c6e86359e581b8f2275673d76 not found: ID does not exist" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.608159 4725 scope.go:117] "RemoveContainer" containerID="65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610" Feb 25 11:24:54 crc kubenswrapper[4725]: E0225 11:24:54.608380 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610\": container with ID starting with 65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610 not found: ID does not exist" 
containerID="65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610" Feb 25 11:24:54 crc kubenswrapper[4725]: I0225 11:24:54.608409 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610"} err="failed to get container status \"65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610\": rpc error: code = NotFound desc = could not find container \"65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610\": container with ID starting with 65b23a9a21df9dbeaa869f2e187de422b15f57dadefb4d1ceb102edcbb0fd610 not found: ID does not exist" Feb 25 11:24:55 crc kubenswrapper[4725]: I0225 11:24:55.239399 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3eb1067-5959-4677-970f-dccab81b334f" path="/var/lib/kubelet/pods/d3eb1067-5959-4677-970f-dccab81b334f/volumes" Feb 25 11:25:00 crc kubenswrapper[4725]: I0225 11:25:00.224665 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:25:00 crc kubenswrapper[4725]: E0225 11:25:00.225419 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:25:11 crc kubenswrapper[4725]: I0225 11:25:11.224998 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:25:11 crc kubenswrapper[4725]: E0225 11:25:11.225976 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:25:18 crc kubenswrapper[4725]: I0225 11:25:18.051981 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6jpc"] Feb 25 11:25:18 crc kubenswrapper[4725]: I0225 11:25:18.062950 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-v6jpc"] Feb 25 11:25:19 crc kubenswrapper[4725]: I0225 11:25:19.246372 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3538d74b-8967-41d6-b4a4-add6bf1558ad" path="/var/lib/kubelet/pods/3538d74b-8967-41d6-b4a4-add6bf1558ad/volumes" Feb 25 11:25:19 crc kubenswrapper[4725]: I0225 11:25:19.814877 4725 scope.go:117] "RemoveContainer" containerID="ce808751965ad771f7787ed4ef2630ae08dddd816d343a30dd7d728b22f88733" Feb 25 11:25:19 crc kubenswrapper[4725]: I0225 11:25:19.876418 4725 scope.go:117] "RemoveContainer" containerID="122fa33a6fb946db18a1343242130a1b974362a2da0c3352409e5fcd2a6858f2" Feb 25 11:25:19 crc kubenswrapper[4725]: I0225 11:25:19.917018 4725 scope.go:117] "RemoveContainer" containerID="03d3d725bf3dc66aced72b53625686ca64039eba98fc55c862ea094d7439551e" Feb 25 11:25:26 crc kubenswrapper[4725]: I0225 11:25:26.225463 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:25:26 crc kubenswrapper[4725]: I0225 11:25:26.880648 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"add76c268fa48b85bd8b4a73353a88415ac719328ee98d349951379413d37c8f"} Feb 25 11:25:44 crc kubenswrapper[4725]: I0225 11:25:44.061098 4725 generic.go:334] "Generic 
(PLEG): container finished" podID="3118e370-4c72-4fc4-bf2b-d27645473666" containerID="4a6894093b39a523f28952ca4e20ba0ee3f455d50bc17b16198f75de198ea21b" exitCode=0 Feb 25 11:25:44 crc kubenswrapper[4725]: I0225 11:25:44.061171 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" event={"ID":"3118e370-4c72-4fc4-bf2b-d27645473666","Type":"ContainerDied","Data":"4a6894093b39a523f28952ca4e20ba0ee3f455d50bc17b16198f75de198ea21b"} Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.540706 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.578133 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-ssh-key-openstack-edpm-ipam\") pod \"3118e370-4c72-4fc4-bf2b-d27645473666\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.578179 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-inventory\") pod \"3118e370-4c72-4fc4-bf2b-d27645473666\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.578210 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4clkw\" (UniqueName: \"kubernetes.io/projected/3118e370-4c72-4fc4-bf2b-d27645473666-kube-api-access-4clkw\") pod \"3118e370-4c72-4fc4-bf2b-d27645473666\" (UID: \"3118e370-4c72-4fc4-bf2b-d27645473666\") " Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.600220 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3118e370-4c72-4fc4-bf2b-d27645473666-kube-api-access-4clkw" (OuterVolumeSpecName: "kube-api-access-4clkw") pod "3118e370-4c72-4fc4-bf2b-d27645473666" (UID: "3118e370-4c72-4fc4-bf2b-d27645473666"). InnerVolumeSpecName "kube-api-access-4clkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.617176 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3118e370-4c72-4fc4-bf2b-d27645473666" (UID: "3118e370-4c72-4fc4-bf2b-d27645473666"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.633536 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-inventory" (OuterVolumeSpecName: "inventory") pod "3118e370-4c72-4fc4-bf2b-d27645473666" (UID: "3118e370-4c72-4fc4-bf2b-d27645473666"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.680014 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.680044 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3118e370-4c72-4fc4-bf2b-d27645473666-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:25:45 crc kubenswrapper[4725]: I0225 11:25:45.680053 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4clkw\" (UniqueName: \"kubernetes.io/projected/3118e370-4c72-4fc4-bf2b-d27645473666-kube-api-access-4clkw\") on node \"crc\" DevicePath \"\"" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.079497 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" event={"ID":"3118e370-4c72-4fc4-bf2b-d27645473666","Type":"ContainerDied","Data":"0602e7b6b0637bcaceb93c5b2f4145a4b26d83999452ee5cc720d85c288eff34"} Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.079541 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0602e7b6b0637bcaceb93c5b2f4145a4b26d83999452ee5cc720d85c288eff34" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.079622 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.190899 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rfwzw"] Feb 25 11:25:46 crc kubenswrapper[4725]: E0225 11:25:46.191352 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="extract-content" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.191374 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="extract-content" Feb 25 11:25:46 crc kubenswrapper[4725]: E0225 11:25:46.191396 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3118e370-4c72-4fc4-bf2b-d27645473666" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.191407 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3118e370-4c72-4fc4-bf2b-d27645473666" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 11:25:46 crc kubenswrapper[4725]: E0225 11:25:46.191432 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="registry-server" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.191440 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="registry-server" Feb 25 11:25:46 crc kubenswrapper[4725]: E0225 11:25:46.191459 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="extract-utilities" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.191467 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="extract-utilities" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.191693 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eb1067-5959-4677-970f-dccab81b334f" containerName="registry-server" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.191719 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3118e370-4c72-4fc4-bf2b-d27645473666" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.192440 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.194894 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.194924 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.195497 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.197071 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.202588 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rfwzw"] Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.298986 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdgnl\" (UniqueName: \"kubernetes.io/projected/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-kube-api-access-xdgnl\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.299079 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.299135 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.402564 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.402639 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.402793 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdgnl\" (UniqueName: \"kubernetes.io/projected/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-kube-api-access-xdgnl\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.408957 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.409644 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.426916 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdgnl\" (UniqueName: \"kubernetes.io/projected/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-kube-api-access-xdgnl\") pod \"ssh-known-hosts-edpm-deployment-rfwzw\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:46 crc kubenswrapper[4725]: I0225 11:25:46.520497 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:47 crc kubenswrapper[4725]: I0225 11:25:47.120772 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rfwzw"] Feb 25 11:25:48 crc kubenswrapper[4725]: I0225 11:25:48.118102 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" event={"ID":"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d","Type":"ContainerStarted","Data":"619ca4e12f671c2910a4eb02bb4b7ab6428e1d910298862142a90e6aef3f9bb3"} Feb 25 11:25:48 crc kubenswrapper[4725]: I0225 11:25:48.118889 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" event={"ID":"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d","Type":"ContainerStarted","Data":"21d9016ee3a9f0a728bb69b63426552979d949c1cbb6c9a295d649ceed276a12"} Feb 25 11:25:48 crc kubenswrapper[4725]: I0225 11:25:48.149465 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" podStartSLOduration=1.693507994 podStartE2EDuration="2.149442475s" podCreationTimestamp="2026-02-25 11:25:46 +0000 UTC" firstStartedPulling="2026-02-25 11:25:47.11158835 +0000 UTC m=+1972.610170375" lastFinishedPulling="2026-02-25 11:25:47.567522791 +0000 UTC m=+1973.066104856" observedRunningTime="2026-02-25 11:25:48.145724317 +0000 UTC m=+1973.644306372" watchObservedRunningTime="2026-02-25 11:25:48.149442475 +0000 UTC m=+1973.648024510" Feb 25 11:25:55 crc kubenswrapper[4725]: I0225 11:25:55.196567 4725 generic.go:334] "Generic (PLEG): container finished" podID="f5f7958b-17b2-40ba-a17b-bc8eefa6d59d" containerID="619ca4e12f671c2910a4eb02bb4b7ab6428e1d910298862142a90e6aef3f9bb3" exitCode=0 Feb 25 11:25:55 crc kubenswrapper[4725]: I0225 11:25:55.196724 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" 
event={"ID":"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d","Type":"ContainerDied","Data":"619ca4e12f671c2910a4eb02bb4b7ab6428e1d910298862142a90e6aef3f9bb3"} Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.696558 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.738044 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-ssh-key-openstack-edpm-ipam\") pod \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.738144 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdgnl\" (UniqueName: \"kubernetes.io/projected/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-kube-api-access-xdgnl\") pod \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.738212 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-inventory-0\") pod \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\" (UID: \"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d\") " Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.761901 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-kube-api-access-xdgnl" (OuterVolumeSpecName: "kube-api-access-xdgnl") pod "f5f7958b-17b2-40ba-a17b-bc8eefa6d59d" (UID: "f5f7958b-17b2-40ba-a17b-bc8eefa6d59d"). InnerVolumeSpecName "kube-api-access-xdgnl". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.787209 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5f7958b-17b2-40ba-a17b-bc8eefa6d59d" (UID: "f5f7958b-17b2-40ba-a17b-bc8eefa6d59d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.798850 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f5f7958b-17b2-40ba-a17b-bc8eefa6d59d" (UID: "f5f7958b-17b2-40ba-a17b-bc8eefa6d59d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.840083 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.840115 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdgnl\" (UniqueName: \"kubernetes.io/projected/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-kube-api-access-xdgnl\") on node \"crc\" DevicePath \"\""
Feb 25 11:25:56 crc kubenswrapper[4725]: I0225 11:25:56.840124 4725 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f5f7958b-17b2-40ba-a17b-bc8eefa6d59d-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.224067 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.239372 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rfwzw" event={"ID":"f5f7958b-17b2-40ba-a17b-bc8eefa6d59d","Type":"ContainerDied","Data":"21d9016ee3a9f0a728bb69b63426552979d949c1cbb6c9a295d649ceed276a12"}
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.239730 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21d9016ee3a9f0a728bb69b63426552979d949c1cbb6c9a295d649ceed276a12"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.314336 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"]
Feb 25 11:25:57 crc kubenswrapper[4725]: E0225 11:25:57.314993 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f7958b-17b2-40ba-a17b-bc8eefa6d59d" containerName="ssh-known-hosts-edpm-deployment"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.315020 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f7958b-17b2-40ba-a17b-bc8eefa6d59d" containerName="ssh-known-hosts-edpm-deployment"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.315339 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f7958b-17b2-40ba-a17b-bc8eefa6d59d" containerName="ssh-known-hosts-edpm-deployment"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.316451 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.319729 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.319762 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.320356 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.323225 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"]
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.334184 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.351775 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.351950 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.352195 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm5zx\" (UniqueName: \"kubernetes.io/projected/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-kube-api-access-gm5zx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.453197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm5zx\" (UniqueName: \"kubernetes.io/projected/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-kube-api-access-gm5zx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.453267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.453327 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.462406 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.465388 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.487207 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm5zx\" (UniqueName: \"kubernetes.io/projected/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-kube-api-access-gm5zx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gx8f9\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:57 crc kubenswrapper[4725]: I0225 11:25:57.652887 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:25:58 crc kubenswrapper[4725]: I0225 11:25:58.234857 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"]
Feb 25 11:25:59 crc kubenswrapper[4725]: I0225 11:25:59.248907 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9" event={"ID":"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8","Type":"ContainerStarted","Data":"6a75432509affadfb3d85a860bce3a4805c79ab4fbc666edf7f8d9a62d808e84"}
Feb 25 11:25:59 crc kubenswrapper[4725]: I0225 11:25:59.249268 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9" event={"ID":"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8","Type":"ContainerStarted","Data":"d0f56efec895f19033b3d0404905f401efbd04e94d53941a31860911ef525b85"}
Feb 25 11:25:59 crc kubenswrapper[4725]: I0225 11:25:59.273595 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9" podStartSLOduration=1.828487001 podStartE2EDuration="2.273558706s" podCreationTimestamp="2026-02-25 11:25:57 +0000 UTC" firstStartedPulling="2026-02-25 11:25:58.250926552 +0000 UTC m=+1983.749508587" lastFinishedPulling="2026-02-25 11:25:58.695998227 +0000 UTC m=+1984.194580292" observedRunningTime="2026-02-25 11:25:59.268874373 +0000 UTC m=+1984.767456438" watchObservedRunningTime="2026-02-25 11:25:59.273558706 +0000 UTC m=+1984.772140741"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.151284 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533646-rtfb8"]
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.153349 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533646-rtfb8"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.157360 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.159171 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.159417 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.170749 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533646-rtfb8"]
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.218134 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg4tq\" (UniqueName: \"kubernetes.io/projected/4cd0fea4-28ef-4a6e-8b5b-7d137842723d-kube-api-access-jg4tq\") pod \"auto-csr-approver-29533646-rtfb8\" (UID: \"4cd0fea4-28ef-4a6e-8b5b-7d137842723d\") " pod="openshift-infra/auto-csr-approver-29533646-rtfb8"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.320855 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg4tq\" (UniqueName: \"kubernetes.io/projected/4cd0fea4-28ef-4a6e-8b5b-7d137842723d-kube-api-access-jg4tq\") pod \"auto-csr-approver-29533646-rtfb8\" (UID: \"4cd0fea4-28ef-4a6e-8b5b-7d137842723d\") " pod="openshift-infra/auto-csr-approver-29533646-rtfb8"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.339731 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg4tq\" (UniqueName: \"kubernetes.io/projected/4cd0fea4-28ef-4a6e-8b5b-7d137842723d-kube-api-access-jg4tq\") pod \"auto-csr-approver-29533646-rtfb8\" (UID: \"4cd0fea4-28ef-4a6e-8b5b-7d137842723d\") " pod="openshift-infra/auto-csr-approver-29533646-rtfb8"
Feb 25 11:26:00 crc kubenswrapper[4725]: I0225 11:26:00.507631 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533646-rtfb8"
Feb 25 11:26:01 crc kubenswrapper[4725]: I0225 11:26:01.006214 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533646-rtfb8"]
Feb 25 11:26:01 crc kubenswrapper[4725]: W0225 11:26:01.007857 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cd0fea4_28ef_4a6e_8b5b_7d137842723d.slice/crio-d7387fa55eba5e0569aa501965ba5d7ffd73950d160c3f2e0b7f3f80f6da2935 WatchSource:0}: Error finding container d7387fa55eba5e0569aa501965ba5d7ffd73950d160c3f2e0b7f3f80f6da2935: Status 404 returned error can't find the container with id d7387fa55eba5e0569aa501965ba5d7ffd73950d160c3f2e0b7f3f80f6da2935
Feb 25 11:26:01 crc kubenswrapper[4725]: I0225 11:26:01.271432 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533646-rtfb8" event={"ID":"4cd0fea4-28ef-4a6e-8b5b-7d137842723d","Type":"ContainerStarted","Data":"d7387fa55eba5e0569aa501965ba5d7ffd73950d160c3f2e0b7f3f80f6da2935"}
Feb 25 11:26:03 crc kubenswrapper[4725]: I0225 11:26:03.294094 4725 generic.go:334] "Generic (PLEG): container finished" podID="4cd0fea4-28ef-4a6e-8b5b-7d137842723d" containerID="88dae455f946b82bc12405b15fe9b7803b8d1088a3c30cb18c6e6f9f926dfa11" exitCode=0
Feb 25 11:26:03 crc kubenswrapper[4725]: I0225 11:26:03.294259 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533646-rtfb8" event={"ID":"4cd0fea4-28ef-4a6e-8b5b-7d137842723d","Type":"ContainerDied","Data":"88dae455f946b82bc12405b15fe9b7803b8d1088a3c30cb18c6e6f9f926dfa11"}
Feb 25 11:26:04 crc kubenswrapper[4725]: I0225 11:26:04.701740 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533646-rtfb8"
Feb 25 11:26:04 crc kubenswrapper[4725]: I0225 11:26:04.710936 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg4tq\" (UniqueName: \"kubernetes.io/projected/4cd0fea4-28ef-4a6e-8b5b-7d137842723d-kube-api-access-jg4tq\") pod \"4cd0fea4-28ef-4a6e-8b5b-7d137842723d\" (UID: \"4cd0fea4-28ef-4a6e-8b5b-7d137842723d\") "
Feb 25 11:26:04 crc kubenswrapper[4725]: I0225 11:26:04.717191 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd0fea4-28ef-4a6e-8b5b-7d137842723d-kube-api-access-jg4tq" (OuterVolumeSpecName: "kube-api-access-jg4tq") pod "4cd0fea4-28ef-4a6e-8b5b-7d137842723d" (UID: "4cd0fea4-28ef-4a6e-8b5b-7d137842723d"). InnerVolumeSpecName "kube-api-access-jg4tq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:26:04 crc kubenswrapper[4725]: I0225 11:26:04.813814 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg4tq\" (UniqueName: \"kubernetes.io/projected/4cd0fea4-28ef-4a6e-8b5b-7d137842723d-kube-api-access-jg4tq\") on node \"crc\" DevicePath \"\""
Feb 25 11:26:05 crc kubenswrapper[4725]: I0225 11:26:05.319853 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533646-rtfb8" event={"ID":"4cd0fea4-28ef-4a6e-8b5b-7d137842723d","Type":"ContainerDied","Data":"d7387fa55eba5e0569aa501965ba5d7ffd73950d160c3f2e0b7f3f80f6da2935"}
Feb 25 11:26:05 crc kubenswrapper[4725]: I0225 11:26:05.319894 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7387fa55eba5e0569aa501965ba5d7ffd73950d160c3f2e0b7f3f80f6da2935"
Feb 25 11:26:05 crc kubenswrapper[4725]: I0225 11:26:05.319992 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533646-rtfb8"
Feb 25 11:26:05 crc kubenswrapper[4725]: I0225 11:26:05.791800 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533640-q8ps4"]
Feb 25 11:26:05 crc kubenswrapper[4725]: I0225 11:26:05.804189 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533640-q8ps4"]
Feb 25 11:26:07 crc kubenswrapper[4725]: I0225 11:26:07.240566 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d" path="/var/lib/kubelet/pods/4f45ceeb-b1cb-49af-b7ec-7f3a9e85c89d/volumes"
Feb 25 11:26:07 crc kubenswrapper[4725]: I0225 11:26:07.356812 4725 generic.go:334] "Generic (PLEG): container finished" podID="6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8" containerID="6a75432509affadfb3d85a860bce3a4805c79ab4fbc666edf7f8d9a62d808e84" exitCode=0
Feb 25 11:26:07 crc kubenswrapper[4725]: I0225 11:26:07.356936 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9" event={"ID":"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8","Type":"ContainerDied","Data":"6a75432509affadfb3d85a860bce3a4805c79ab4fbc666edf7f8d9a62d808e84"}
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.769439 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.889275 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-ssh-key-openstack-edpm-ipam\") pod \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") "
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.889376 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-inventory\") pod \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") "
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.889575 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm5zx\" (UniqueName: \"kubernetes.io/projected/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-kube-api-access-gm5zx\") pod \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\" (UID: \"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8\") "
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.908375 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-kube-api-access-gm5zx" (OuterVolumeSpecName: "kube-api-access-gm5zx") pod "6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8" (UID: "6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8"). InnerVolumeSpecName "kube-api-access-gm5zx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.917413 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-inventory" (OuterVolumeSpecName: "inventory") pod "6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8" (UID: "6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.935790 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8" (UID: "6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.992623 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm5zx\" (UniqueName: \"kubernetes.io/projected/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-kube-api-access-gm5zx\") on node \"crc\" DevicePath \"\""
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.992672 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 25 11:26:08 crc kubenswrapper[4725]: I0225 11:26:08.992694 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8-inventory\") on node \"crc\" DevicePath \"\""
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.375114 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9" event={"ID":"6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8","Type":"ContainerDied","Data":"d0f56efec895f19033b3d0404905f401efbd04e94d53941a31860911ef525b85"}
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.375155 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0f56efec895f19033b3d0404905f401efbd04e94d53941a31860911ef525b85"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.375172 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gx8f9"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.466677 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"]
Feb 25 11:26:09 crc kubenswrapper[4725]: E0225 11:26:09.467213 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.467236 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:26:09 crc kubenswrapper[4725]: E0225 11:26:09.467269 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd0fea4-28ef-4a6e-8b5b-7d137842723d" containerName="oc"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.467277 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd0fea4-28ef-4a6e-8b5b-7d137842723d" containerName="oc"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.467510 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd0fea4-28ef-4a6e-8b5b-7d137842723d" containerName="oc"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.467527 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.468219 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.470701 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.471086 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.471285 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.471565 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.477473 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"]
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.607764 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.607857 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.608228 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkh5n\" (UniqueName: \"kubernetes.io/projected/2b6f1103-ca9d-4e09-9816-83e1751a56ff-kube-api-access-xkh5n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.710577 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.710665 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.710817 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkh5n\" (UniqueName: \"kubernetes.io/projected/2b6f1103-ca9d-4e09-9816-83e1751a56ff-kube-api-access-xkh5n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.716911 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.718517 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.738805 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkh5n\" (UniqueName: \"kubernetes.io/projected/2b6f1103-ca9d-4e09-9816-83e1751a56ff-kube-api-access-xkh5n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:09 crc kubenswrapper[4725]: I0225 11:26:09.821231 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:10 crc kubenswrapper[4725]: I0225 11:26:10.418918 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"]
Feb 25 11:26:10 crc kubenswrapper[4725]: W0225 11:26:10.427306 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b6f1103_ca9d_4e09_9816_83e1751a56ff.slice/crio-f615bc389ed887e7421c3386d40a5679dd07efd123a35ea5e70624eaf298d242 WatchSource:0}: Error finding container f615bc389ed887e7421c3386d40a5679dd07efd123a35ea5e70624eaf298d242: Status 404 returned error can't find the container with id f615bc389ed887e7421c3386d40a5679dd07efd123a35ea5e70624eaf298d242
Feb 25 11:26:11 crc kubenswrapper[4725]: I0225 11:26:11.395519 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq" event={"ID":"2b6f1103-ca9d-4e09-9816-83e1751a56ff","Type":"ContainerStarted","Data":"e31cc7d619fb206d5159832852919ef923f07785678895607859072fe3f02c09"}
Feb 25 11:26:11 crc kubenswrapper[4725]: I0225 11:26:11.395977 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq" event={"ID":"2b6f1103-ca9d-4e09-9816-83e1751a56ff","Type":"ContainerStarted","Data":"f615bc389ed887e7421c3386d40a5679dd07efd123a35ea5e70624eaf298d242"}
Feb 25 11:26:20 crc kubenswrapper[4725]: I0225 11:26:20.055367 4725 scope.go:117] "RemoveContainer" containerID="99e302b9e980f477bf5fa5dfb9f14b3e2ec114c5a5908f015ef8bbe7d8463d06"
Feb 25 11:26:21 crc kubenswrapper[4725]: I0225 11:26:21.508051 4725 generic.go:334] "Generic (PLEG): container finished" podID="2b6f1103-ca9d-4e09-9816-83e1751a56ff" containerID="e31cc7d619fb206d5159832852919ef923f07785678895607859072fe3f02c09" exitCode=0
Feb 25 11:26:21 crc kubenswrapper[4725]: I0225 11:26:21.508161 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq" event={"ID":"2b6f1103-ca9d-4e09-9816-83e1751a56ff","Type":"ContainerDied","Data":"e31cc7d619fb206d5159832852919ef923f07785678895607859072fe3f02c09"}
Feb 25 11:26:22 crc kubenswrapper[4725]: I0225 11:26:22.934852 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:22 crc kubenswrapper[4725]: I0225 11:26:22.978497 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkh5n\" (UniqueName: \"kubernetes.io/projected/2b6f1103-ca9d-4e09-9816-83e1751a56ff-kube-api-access-xkh5n\") pod \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") "
Feb 25 11:26:22 crc kubenswrapper[4725]: I0225 11:26:22.978631 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-ssh-key-openstack-edpm-ipam\") pod \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") "
Feb 25 11:26:22 crc kubenswrapper[4725]: I0225 11:26:22.978797 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-inventory\") pod \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\" (UID: \"2b6f1103-ca9d-4e09-9816-83e1751a56ff\") "
Feb 25 11:26:22 crc kubenswrapper[4725]: I0225 11:26:22.987099 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6f1103-ca9d-4e09-9816-83e1751a56ff-kube-api-access-xkh5n" (OuterVolumeSpecName: "kube-api-access-xkh5n") pod "2b6f1103-ca9d-4e09-9816-83e1751a56ff" (UID: "2b6f1103-ca9d-4e09-9816-83e1751a56ff"). InnerVolumeSpecName "kube-api-access-xkh5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.008455 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2b6f1103-ca9d-4e09-9816-83e1751a56ff" (UID: "2b6f1103-ca9d-4e09-9816-83e1751a56ff"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.017198 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-inventory" (OuterVolumeSpecName: "inventory") pod "2b6f1103-ca9d-4e09-9816-83e1751a56ff" (UID: "2b6f1103-ca9d-4e09-9816-83e1751a56ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.082218 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkh5n\" (UniqueName: \"kubernetes.io/projected/2b6f1103-ca9d-4e09-9816-83e1751a56ff-kube-api-access-xkh5n\") on node \"crc\" DevicePath \"\""
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.082274 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.082290 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b6f1103-ca9d-4e09-9816-83e1751a56ff-inventory\") on node \"crc\" DevicePath \"\""
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.526479 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq" event={"ID":"2b6f1103-ca9d-4e09-9816-83e1751a56ff","Type":"ContainerDied","Data":"f615bc389ed887e7421c3386d40a5679dd07efd123a35ea5e70624eaf298d242"}
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.526517 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f615bc389ed887e7421c3386d40a5679dd07efd123a35ea5e70624eaf298d242"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.526523 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.639874 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"]
Feb 25 11:26:23 crc kubenswrapper[4725]: E0225 11:26:23.640470 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6f1103-ca9d-4e09-9816-83e1751a56ff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.640537 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6f1103-ca9d-4e09-9816-83e1751a56ff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.640781 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6f1103-ca9d-4e09-9816-83e1751a56ff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.641532 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.643763 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.643983 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.644134 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.644432 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.644542 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.644727 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.644975 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.645366 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.659880 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"]
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.694820 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.694920 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.694972 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztk8p\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-kube-api-access-ztk8p\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.694993 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"
Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695014 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695072 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695109 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695155 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695228 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-nova-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695260 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695284 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695315 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695352 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-libvirt-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.695437 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.796840 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.796919 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.796973 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797003 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797040 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztk8p\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-kube-api-access-ztk8p\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797057 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797077 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797097 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797113 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797196 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797219 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797235 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.797256 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.803209 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.803395 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.803598 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.803626 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.804046 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.804247 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.804419 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.804968 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.805022 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.805555 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: 
\"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.805746 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.806223 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.806909 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.826992 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztk8p\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-kube-api-access-ztk8p\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-skp48\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 
11:26:23 crc kubenswrapper[4725]: I0225 11:26:23.957744 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:26:24 crc kubenswrapper[4725]: I0225 11:26:24.507769 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48"] Feb 25 11:26:24 crc kubenswrapper[4725]: W0225 11:26:24.511711 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb848df94_cae6_4ec8_bade_58be45c1cb4e.slice/crio-5db078df4e66a3a6e26e134a84ea3e5d8c693d3e10fc399ff7d05d199351db56 WatchSource:0}: Error finding container 5db078df4e66a3a6e26e134a84ea3e5d8c693d3e10fc399ff7d05d199351db56: Status 404 returned error can't find the container with id 5db078df4e66a3a6e26e134a84ea3e5d8c693d3e10fc399ff7d05d199351db56 Feb 25 11:26:24 crc kubenswrapper[4725]: I0225 11:26:24.539868 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" event={"ID":"b848df94-cae6-4ec8-bade-58be45c1cb4e","Type":"ContainerStarted","Data":"5db078df4e66a3a6e26e134a84ea3e5d8c693d3e10fc399ff7d05d199351db56"} Feb 25 11:26:25 crc kubenswrapper[4725]: I0225 11:26:25.552775 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" event={"ID":"b848df94-cae6-4ec8-bade-58be45c1cb4e","Type":"ContainerStarted","Data":"72d56aa0456521415ed50068d5495b8452c97e16574af9592ced68d9b9e6be0f"} Feb 25 11:26:25 crc kubenswrapper[4725]: I0225 11:26:25.578067 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" podStartSLOduration=2.107314519 podStartE2EDuration="2.578039159s" podCreationTimestamp="2026-02-25 11:26:23 +0000 UTC" firstStartedPulling="2026-02-25 11:26:24.516084721 
+0000 UTC m=+2010.014666766" lastFinishedPulling="2026-02-25 11:26:24.986809371 +0000 UTC m=+2010.485391406" observedRunningTime="2026-02-25 11:26:25.574204818 +0000 UTC m=+2011.072786883" watchObservedRunningTime="2026-02-25 11:26:25.578039159 +0000 UTC m=+2011.076621224" Feb 25 11:27:03 crc kubenswrapper[4725]: I0225 11:27:03.909338 4725 generic.go:334] "Generic (PLEG): container finished" podID="b848df94-cae6-4ec8-bade-58be45c1cb4e" containerID="72d56aa0456521415ed50068d5495b8452c97e16574af9592ced68d9b9e6be0f" exitCode=0 Feb 25 11:27:03 crc kubenswrapper[4725]: I0225 11:27:03.909410 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" event={"ID":"b848df94-cae6-4ec8-bade-58be45c1cb4e","Type":"ContainerDied","Data":"72d56aa0456521415ed50068d5495b8452c97e16574af9592ced68d9b9e6be0f"} Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.350254 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462020 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-nova-combined-ca-bundle\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462075 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-telemetry-combined-ca-bundle\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462121 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ztk8p\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-kube-api-access-ztk8p\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462175 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462197 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-repo-setup-combined-ca-bundle\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462220 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462267 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-bootstrap-combined-ca-bundle\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462336 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462378 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ovn-combined-ca-bundle\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462404 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-neutron-metadata-combined-ca-bundle\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462444 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462495 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ssh-key-openstack-edpm-ipam\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462519 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-inventory\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.462534 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-libvirt-combined-ca-bundle\") pod \"b848df94-cae6-4ec8-bade-58be45c1cb4e\" (UID: \"b848df94-cae6-4ec8-bade-58be45c1cb4e\") " Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.468787 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.468923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-kube-api-access-ztk8p" (OuterVolumeSpecName: "kube-api-access-ztk8p") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "kube-api-access-ztk8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.470763 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.470798 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.470879 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.471664 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.471810 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.473510 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.473751 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.473758 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.477044 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.481632 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.493413 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.512098 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-inventory" (OuterVolumeSpecName: "inventory") pod "b848df94-cae6-4ec8-bade-58be45c1cb4e" (UID: "b848df94-cae6-4ec8-bade-58be45c1cb4e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564871 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564915 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564930 4725 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564943 4725 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564955 4725 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564968 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztk8p\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-kube-api-access-ztk8p\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564981 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.564994 4725 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.565007 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.565021 4725 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.565033 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.565047 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.565059 4725 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848df94-cae6-4ec8-bade-58be45c1cb4e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc 
kubenswrapper[4725]: I0225 11:27:05.565073 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b848df94-cae6-4ec8-bade-58be45c1cb4e-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.930629 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" event={"ID":"b848df94-cae6-4ec8-bade-58be45c1cb4e","Type":"ContainerDied","Data":"5db078df4e66a3a6e26e134a84ea3e5d8c693d3e10fc399ff7d05d199351db56"} Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.930676 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db078df4e66a3a6e26e134a84ea3e5d8c693d3e10fc399ff7d05d199351db56" Feb 25 11:27:05 crc kubenswrapper[4725]: I0225 11:27:05.930701 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-skp48" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.043020 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr"] Feb 25 11:27:06 crc kubenswrapper[4725]: E0225 11:27:06.043646 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b848df94-cae6-4ec8-bade-58be45c1cb4e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.043737 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b848df94-cae6-4ec8-bade-58be45c1cb4e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.044027 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b848df94-cae6-4ec8-bade-58be45c1cb4e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 11:27:06 crc 
kubenswrapper[4725]: I0225 11:27:06.044741 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.047287 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.047786 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.047972 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.065122 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr"] Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.099184 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.100987 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.201668 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.201781 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.201811 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.201859 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.201981 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc5zr\" (UniqueName: \"kubernetes.io/projected/65453adf-918b-40e1-bce0-4d4cb4ab7f56-kube-api-access-nc5zr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.303637 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc5zr\" (UniqueName: \"kubernetes.io/projected/65453adf-918b-40e1-bce0-4d4cb4ab7f56-kube-api-access-nc5zr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 
11:27:06.303699 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.303768 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.303788 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.303806 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.305332 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.308444 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.309664 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.310344 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.323705 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc5zr\" (UniqueName: \"kubernetes.io/projected/65453adf-918b-40e1-bce0-4d4cb4ab7f56-kube-api-access-nc5zr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgwzr\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.358745 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.894344 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr"] Feb 25 11:27:06 crc kubenswrapper[4725]: I0225 11:27:06.943321 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" event={"ID":"65453adf-918b-40e1-bce0-4d4cb4ab7f56","Type":"ContainerStarted","Data":"19d52ad965a9d5c475a82cfbcf399f30d13397614169c70ef3252971b8c0a772"} Feb 25 11:27:07 crc kubenswrapper[4725]: I0225 11:27:07.956998 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" event={"ID":"65453adf-918b-40e1-bce0-4d4cb4ab7f56","Type":"ContainerStarted","Data":"a556b0409fc0ab0a6d1224d6f6f6abe085baac678815a74307e97203618a31d0"} Feb 25 11:27:07 crc kubenswrapper[4725]: I0225 11:27:07.988187 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" podStartSLOduration=1.5711645509999999 podStartE2EDuration="1.988161218s" podCreationTimestamp="2026-02-25 11:27:06 +0000 UTC" firstStartedPulling="2026-02-25 11:27:06.906551463 +0000 UTC m=+2052.405133488" lastFinishedPulling="2026-02-25 11:27:07.32354811 +0000 UTC m=+2052.822130155" observedRunningTime="2026-02-25 11:27:07.975566327 +0000 UTC m=+2053.474148362" watchObservedRunningTime="2026-02-25 11:27:07.988161218 +0000 UTC m=+2053.486743253" Feb 25 11:27:41 crc kubenswrapper[4725]: I0225 11:27:41.555475 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:27:41 crc kubenswrapper[4725]: I0225 11:27:41.556137 
4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.172675 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533648-pdctk"] Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.174700 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533648-pdctk" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.178048 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.178482 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.179009 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.187545 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533648-pdctk"] Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.272891 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwc2\" (UniqueName: \"kubernetes.io/projected/09858624-9ec4-4226-b86d-c6fc95b91ba9-kube-api-access-bqwc2\") pod \"auto-csr-approver-29533648-pdctk\" (UID: \"09858624-9ec4-4226-b86d-c6fc95b91ba9\") " pod="openshift-infra/auto-csr-approver-29533648-pdctk" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.374618 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bqwc2\" (UniqueName: \"kubernetes.io/projected/09858624-9ec4-4226-b86d-c6fc95b91ba9-kube-api-access-bqwc2\") pod \"auto-csr-approver-29533648-pdctk\" (UID: \"09858624-9ec4-4226-b86d-c6fc95b91ba9\") " pod="openshift-infra/auto-csr-approver-29533648-pdctk" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.397664 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwc2\" (UniqueName: \"kubernetes.io/projected/09858624-9ec4-4226-b86d-c6fc95b91ba9-kube-api-access-bqwc2\") pod \"auto-csr-approver-29533648-pdctk\" (UID: \"09858624-9ec4-4226-b86d-c6fc95b91ba9\") " pod="openshift-infra/auto-csr-approver-29533648-pdctk" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.504177 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533648-pdctk" Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.953008 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533648-pdctk"] Feb 25 11:28:00 crc kubenswrapper[4725]: I0225 11:28:00.962137 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:28:01 crc kubenswrapper[4725]: I0225 11:28:01.452807 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533648-pdctk" event={"ID":"09858624-9ec4-4226-b86d-c6fc95b91ba9","Type":"ContainerStarted","Data":"5a0cb0837c4418f3d0d7dc94296bd8d498baff58e3b9fe8c6ec36cce350f4d15"} Feb 25 11:28:02 crc kubenswrapper[4725]: I0225 11:28:02.464711 4725 generic.go:334] "Generic (PLEG): container finished" podID="09858624-9ec4-4226-b86d-c6fc95b91ba9" containerID="36fc4f5ac8d7b9bc2a7afb4209b58c4a9bc204b872b1d5d385ca72af86f632b3" exitCode=0 Feb 25 11:28:02 crc kubenswrapper[4725]: I0225 11:28:02.465069 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533648-pdctk" 
event={"ID":"09858624-9ec4-4226-b86d-c6fc95b91ba9","Type":"ContainerDied","Data":"36fc4f5ac8d7b9bc2a7afb4209b58c4a9bc204b872b1d5d385ca72af86f632b3"} Feb 25 11:28:03 crc kubenswrapper[4725]: I0225 11:28:03.802520 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533648-pdctk" Feb 25 11:28:03 crc kubenswrapper[4725]: I0225 11:28:03.846724 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqwc2\" (UniqueName: \"kubernetes.io/projected/09858624-9ec4-4226-b86d-c6fc95b91ba9-kube-api-access-bqwc2\") pod \"09858624-9ec4-4226-b86d-c6fc95b91ba9\" (UID: \"09858624-9ec4-4226-b86d-c6fc95b91ba9\") " Feb 25 11:28:03 crc kubenswrapper[4725]: I0225 11:28:03.852618 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09858624-9ec4-4226-b86d-c6fc95b91ba9-kube-api-access-bqwc2" (OuterVolumeSpecName: "kube-api-access-bqwc2") pod "09858624-9ec4-4226-b86d-c6fc95b91ba9" (UID: "09858624-9ec4-4226-b86d-c6fc95b91ba9"). InnerVolumeSpecName "kube-api-access-bqwc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:28:03 crc kubenswrapper[4725]: I0225 11:28:03.949641 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqwc2\" (UniqueName: \"kubernetes.io/projected/09858624-9ec4-4226-b86d-c6fc95b91ba9-kube-api-access-bqwc2\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:04 crc kubenswrapper[4725]: I0225 11:28:04.485741 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533648-pdctk" event={"ID":"09858624-9ec4-4226-b86d-c6fc95b91ba9","Type":"ContainerDied","Data":"5a0cb0837c4418f3d0d7dc94296bd8d498baff58e3b9fe8c6ec36cce350f4d15"} Feb 25 11:28:04 crc kubenswrapper[4725]: I0225 11:28:04.486069 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a0cb0837c4418f3d0d7dc94296bd8d498baff58e3b9fe8c6ec36cce350f4d15" Feb 25 11:28:04 crc kubenswrapper[4725]: I0225 11:28:04.485882 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533648-pdctk" Feb 25 11:28:04 crc kubenswrapper[4725]: I0225 11:28:04.934080 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533642-wgj8t"] Feb 25 11:28:04 crc kubenswrapper[4725]: I0225 11:28:04.947727 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533642-wgj8t"] Feb 25 11:28:05 crc kubenswrapper[4725]: I0225 11:28:05.236253 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685d89a8-03f3-40ab-849f-27f44039ebe9" path="/var/lib/kubelet/pods/685d89a8-03f3-40ab-849f-27f44039ebe9/volumes" Feb 25 11:28:11 crc kubenswrapper[4725]: I0225 11:28:11.555448 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 25 11:28:11 crc kubenswrapper[4725]: I0225 11:28:11.555936 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:28:12 crc kubenswrapper[4725]: I0225 11:28:12.572089 4725 generic.go:334] "Generic (PLEG): container finished" podID="65453adf-918b-40e1-bce0-4d4cb4ab7f56" containerID="a556b0409fc0ab0a6d1224d6f6f6abe085baac678815a74307e97203618a31d0" exitCode=0 Feb 25 11:28:12 crc kubenswrapper[4725]: I0225 11:28:12.572147 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" event={"ID":"65453adf-918b-40e1-bce0-4d4cb4ab7f56","Type":"ContainerDied","Data":"a556b0409fc0ab0a6d1224d6f6f6abe085baac678815a74307e97203618a31d0"} Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.023165 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.148644 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovn-combined-ca-bundle\") pod \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.149107 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovncontroller-config-0\") pod \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.149237 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc5zr\" (UniqueName: \"kubernetes.io/projected/65453adf-918b-40e1-bce0-4d4cb4ab7f56-kube-api-access-nc5zr\") pod \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.149347 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-inventory\") pod \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.149529 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ssh-key-openstack-edpm-ipam\") pod \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\" (UID: \"65453adf-918b-40e1-bce0-4d4cb4ab7f56\") " Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.156878 4725 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65453adf-918b-40e1-bce0-4d4cb4ab7f56-kube-api-access-nc5zr" (OuterVolumeSpecName: "kube-api-access-nc5zr") pod "65453adf-918b-40e1-bce0-4d4cb4ab7f56" (UID: "65453adf-918b-40e1-bce0-4d4cb4ab7f56"). InnerVolumeSpecName "kube-api-access-nc5zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.157233 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "65453adf-918b-40e1-bce0-4d4cb4ab7f56" (UID: "65453adf-918b-40e1-bce0-4d4cb4ab7f56"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.176621 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "65453adf-918b-40e1-bce0-4d4cb4ab7f56" (UID: "65453adf-918b-40e1-bce0-4d4cb4ab7f56"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.190031 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "65453adf-918b-40e1-bce0-4d4cb4ab7f56" (UID: "65453adf-918b-40e1-bce0-4d4cb4ab7f56"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.190550 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-inventory" (OuterVolumeSpecName: "inventory") pod "65453adf-918b-40e1-bce0-4d4cb4ab7f56" (UID: "65453adf-918b-40e1-bce0-4d4cb4ab7f56"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.251599 4725 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.251657 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc5zr\" (UniqueName: \"kubernetes.io/projected/65453adf-918b-40e1-bce0-4d4cb4ab7f56-kube-api-access-nc5zr\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.251677 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.251695 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.251713 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65453adf-918b-40e1-bce0-4d4cb4ab7f56-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.598068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" event={"ID":"65453adf-918b-40e1-bce0-4d4cb4ab7f56","Type":"ContainerDied","Data":"19d52ad965a9d5c475a82cfbcf399f30d13397614169c70ef3252971b8c0a772"} Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.598116 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19d52ad965a9d5c475a82cfbcf399f30d13397614169c70ef3252971b8c0a772" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.598145 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgwzr" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.691538 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9"] Feb 25 11:28:14 crc kubenswrapper[4725]: E0225 11:28:14.692377 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65453adf-918b-40e1-bce0-4d4cb4ab7f56" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.692410 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="65453adf-918b-40e1-bce0-4d4cb4ab7f56" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 11:28:14 crc kubenswrapper[4725]: E0225 11:28:14.692462 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09858624-9ec4-4226-b86d-c6fc95b91ba9" containerName="oc" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.692473 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="09858624-9ec4-4226-b86d-c6fc95b91ba9" containerName="oc" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.692771 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="09858624-9ec4-4226-b86d-c6fc95b91ba9" containerName="oc" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.692808 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="65453adf-918b-40e1-bce0-4d4cb4ab7f56" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.694024 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.697731 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.698743 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9"] Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.701564 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.701769 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.702082 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.702085 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.703587 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.763936 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: 
\"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.763981 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.764164 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2zcq\" (UniqueName: \"kubernetes.io/projected/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-kube-api-access-z2zcq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.764389 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.764671 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.764910 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.866996 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.867077 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.867169 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.867189 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.867207 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2zcq\" (UniqueName: \"kubernetes.io/projected/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-kube-api-access-z2zcq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.867237 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.870659 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.880082 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.880211 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.880979 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.882445 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:14 crc kubenswrapper[4725]: I0225 11:28:14.888480 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2zcq\" (UniqueName: 
\"kubernetes.io/projected/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-kube-api-access-z2zcq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:15 crc kubenswrapper[4725]: I0225 11:28:15.028742 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:28:15 crc kubenswrapper[4725]: W0225 11:28:15.556251 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9479ee63_ae8c_4dfb_87f0_d92785a85f3b.slice/crio-0910be5a7ffe0753bd371d50fd75f0c82d2d11c03df26e7ad8dfac16cedc17b1 WatchSource:0}: Error finding container 0910be5a7ffe0753bd371d50fd75f0c82d2d11c03df26e7ad8dfac16cedc17b1: Status 404 returned error can't find the container with id 0910be5a7ffe0753bd371d50fd75f0c82d2d11c03df26e7ad8dfac16cedc17b1 Feb 25 11:28:15 crc kubenswrapper[4725]: I0225 11:28:15.558234 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9"] Feb 25 11:28:15 crc kubenswrapper[4725]: I0225 11:28:15.611690 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" event={"ID":"9479ee63-ae8c-4dfb-87f0-d92785a85f3b","Type":"ContainerStarted","Data":"0910be5a7ffe0753bd371d50fd75f0c82d2d11c03df26e7ad8dfac16cedc17b1"} Feb 25 11:28:16 crc kubenswrapper[4725]: I0225 11:28:16.637223 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" event={"ID":"9479ee63-ae8c-4dfb-87f0-d92785a85f3b","Type":"ContainerStarted","Data":"82322b0022ad0b2f63c56bbc22c4f54682cbff523b2df198623b010b9df77607"} Feb 25 11:28:16 crc kubenswrapper[4725]: I0225 11:28:16.691212 4725 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" podStartSLOduration=2.00244982 podStartE2EDuration="2.691195353s" podCreationTimestamp="2026-02-25 11:28:14 +0000 UTC" firstStartedPulling="2026-02-25 11:28:15.563134957 +0000 UTC m=+2121.061716982" lastFinishedPulling="2026-02-25 11:28:16.25188049 +0000 UTC m=+2121.750462515" observedRunningTime="2026-02-25 11:28:16.686158341 +0000 UTC m=+2122.184740376" watchObservedRunningTime="2026-02-25 11:28:16.691195353 +0000 UTC m=+2122.189777378" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.175382 4725 scope.go:117] "RemoveContainer" containerID="ea4c686170b5ab97a3c63c9f80407722fe3ffbabcf0db08138db8f066669bc36" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.843032 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pxzsw"] Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.845703 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.856095 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxzsw"] Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.885869 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-utilities\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.885924 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-catalog-content\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.886089 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnj5l\" (UniqueName: \"kubernetes.io/projected/6c2c47d0-c5f6-4e0c-9436-0e232db99667-kube-api-access-mnj5l\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.988355 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-utilities\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.988413 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-catalog-content\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.988564 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnj5l\" (UniqueName: \"kubernetes.io/projected/6c2c47d0-c5f6-4e0c-9436-0e232db99667-kube-api-access-mnj5l\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.988977 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-utilities\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:20 crc kubenswrapper[4725]: I0225 11:28:20.988983 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-catalog-content\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:21 crc kubenswrapper[4725]: I0225 11:28:21.024777 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnj5l\" (UniqueName: \"kubernetes.io/projected/6c2c47d0-c5f6-4e0c-9436-0e232db99667-kube-api-access-mnj5l\") pod \"community-operators-pxzsw\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:21 crc kubenswrapper[4725]: I0225 11:28:21.170190 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:21 crc kubenswrapper[4725]: W0225 11:28:21.725604 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2c47d0_c5f6_4e0c_9436_0e232db99667.slice/crio-abe27fe4fd1dc4762f7e43bb10c9e11399faf3abb861892b0942d3e4f055ef82 WatchSource:0}: Error finding container abe27fe4fd1dc4762f7e43bb10c9e11399faf3abb861892b0942d3e4f055ef82: Status 404 returned error can't find the container with id abe27fe4fd1dc4762f7e43bb10c9e11399faf3abb861892b0942d3e4f055ef82 Feb 25 11:28:21 crc kubenswrapper[4725]: I0225 11:28:21.726507 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxzsw"] Feb 25 11:28:22 crc kubenswrapper[4725]: I0225 11:28:22.690386 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerID="171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04" exitCode=0 Feb 25 11:28:22 crc kubenswrapper[4725]: I0225 11:28:22.690469 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxzsw" event={"ID":"6c2c47d0-c5f6-4e0c-9436-0e232db99667","Type":"ContainerDied","Data":"171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04"} Feb 25 11:28:22 crc kubenswrapper[4725]: I0225 11:28:22.690749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxzsw" event={"ID":"6c2c47d0-c5f6-4e0c-9436-0e232db99667","Type":"ContainerStarted","Data":"abe27fe4fd1dc4762f7e43bb10c9e11399faf3abb861892b0942d3e4f055ef82"} Feb 25 11:28:23 crc kubenswrapper[4725]: I0225 11:28:23.720389 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxzsw" 
event={"ID":"6c2c47d0-c5f6-4e0c-9436-0e232db99667","Type":"ContainerStarted","Data":"2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b"} Feb 25 11:28:24 crc kubenswrapper[4725]: I0225 11:28:24.733789 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerID="2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b" exitCode=0 Feb 25 11:28:24 crc kubenswrapper[4725]: I0225 11:28:24.733868 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxzsw" event={"ID":"6c2c47d0-c5f6-4e0c-9436-0e232db99667","Type":"ContainerDied","Data":"2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b"} Feb 25 11:28:25 crc kubenswrapper[4725]: I0225 11:28:25.744355 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxzsw" event={"ID":"6c2c47d0-c5f6-4e0c-9436-0e232db99667","Type":"ContainerStarted","Data":"5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81"} Feb 25 11:28:25 crc kubenswrapper[4725]: I0225 11:28:25.787681 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pxzsw" podStartSLOduration=3.358600058 podStartE2EDuration="5.787647288s" podCreationTimestamp="2026-02-25 11:28:20 +0000 UTC" firstStartedPulling="2026-02-25 11:28:22.692439059 +0000 UTC m=+2128.191021084" lastFinishedPulling="2026-02-25 11:28:25.121486289 +0000 UTC m=+2130.620068314" observedRunningTime="2026-02-25 11:28:25.773099525 +0000 UTC m=+2131.271681570" watchObservedRunningTime="2026-02-25 11:28:25.787647288 +0000 UTC m=+2131.286229343" Feb 25 11:28:31 crc kubenswrapper[4725]: I0225 11:28:31.171165 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:31 crc kubenswrapper[4725]: I0225 11:28:31.171820 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:31 crc kubenswrapper[4725]: I0225 11:28:31.223077 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:32 crc kubenswrapper[4725]: I0225 11:28:32.237114 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:32 crc kubenswrapper[4725]: I0225 11:28:32.296068 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxzsw"] Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.205402 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pxzsw" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="registry-server" containerID="cri-o://5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81" gracePeriod=2 Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.678252 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.778330 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnj5l\" (UniqueName: \"kubernetes.io/projected/6c2c47d0-c5f6-4e0c-9436-0e232db99667-kube-api-access-mnj5l\") pod \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.778491 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-utilities\") pod \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.778569 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-catalog-content\") pod \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\" (UID: \"6c2c47d0-c5f6-4e0c-9436-0e232db99667\") " Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.779285 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-utilities" (OuterVolumeSpecName: "utilities") pod "6c2c47d0-c5f6-4e0c-9436-0e232db99667" (UID: "6c2c47d0-c5f6-4e0c-9436-0e232db99667"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.785090 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2c47d0-c5f6-4e0c-9436-0e232db99667-kube-api-access-mnj5l" (OuterVolumeSpecName: "kube-api-access-mnj5l") pod "6c2c47d0-c5f6-4e0c-9436-0e232db99667" (UID: "6c2c47d0-c5f6-4e0c-9436-0e232db99667"). InnerVolumeSpecName "kube-api-access-mnj5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.827669 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c2c47d0-c5f6-4e0c-9436-0e232db99667" (UID: "6c2c47d0-c5f6-4e0c-9436-0e232db99667"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.880254 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.880291 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnj5l\" (UniqueName: \"kubernetes.io/projected/6c2c47d0-c5f6-4e0c-9436-0e232db99667-kube-api-access-mnj5l\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:34 crc kubenswrapper[4725]: I0225 11:28:34.880305 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c2c47d0-c5f6-4e0c-9436-0e232db99667-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.214793 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerID="5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81" exitCode=0 Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.214881 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxzsw" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.214900 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxzsw" event={"ID":"6c2c47d0-c5f6-4e0c-9436-0e232db99667","Type":"ContainerDied","Data":"5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81"} Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.215619 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxzsw" event={"ID":"6c2c47d0-c5f6-4e0c-9436-0e232db99667","Type":"ContainerDied","Data":"abe27fe4fd1dc4762f7e43bb10c9e11399faf3abb861892b0942d3e4f055ef82"} Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.215660 4725 scope.go:117] "RemoveContainer" containerID="5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.237749 4725 scope.go:117] "RemoveContainer" containerID="2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.280200 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxzsw"] Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.290017 4725 scope.go:117] "RemoveContainer" containerID="171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.296307 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pxzsw"] Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.320271 4725 scope.go:117] "RemoveContainer" containerID="5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81" Feb 25 11:28:35 crc kubenswrapper[4725]: E0225 11:28:35.321236 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81\": container with ID starting with 5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81 not found: ID does not exist" containerID="5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.321281 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81"} err="failed to get container status \"5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81\": rpc error: code = NotFound desc = could not find container \"5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81\": container with ID starting with 5677a66ed984a886cf40ee05615431accee2d108f260c7375821e24732110c81 not found: ID does not exist" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.321332 4725 scope.go:117] "RemoveContainer" containerID="2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b" Feb 25 11:28:35 crc kubenswrapper[4725]: E0225 11:28:35.321715 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b\": container with ID starting with 2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b not found: ID does not exist" containerID="2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.321771 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b"} err="failed to get container status \"2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b\": rpc error: code = NotFound desc = could not find container \"2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b\": container with ID 
starting with 2bed3fbfb1891b56986f50928a7abe62daece46fb5b725c1d81fc25cd1b9452b not found: ID does not exist" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.321792 4725 scope.go:117] "RemoveContainer" containerID="171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04" Feb 25 11:28:35 crc kubenswrapper[4725]: E0225 11:28:35.322243 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04\": container with ID starting with 171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04 not found: ID does not exist" containerID="171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04" Feb 25 11:28:35 crc kubenswrapper[4725]: I0225 11:28:35.322266 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04"} err="failed to get container status \"171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04\": rpc error: code = NotFound desc = could not find container \"171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04\": container with ID starting with 171d11b05494a6beeba1c2317301deecd71caf3bf087b814bce54ff376398a04 not found: ID does not exist" Feb 25 11:28:37 crc kubenswrapper[4725]: I0225 11:28:37.236753 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" path="/var/lib/kubelet/pods/6c2c47d0-c5f6-4e0c-9436-0e232db99667/volumes" Feb 25 11:28:41 crc kubenswrapper[4725]: I0225 11:28:41.555649 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:28:41 crc kubenswrapper[4725]: I0225 
11:28:41.556313 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:28:41 crc kubenswrapper[4725]: I0225 11:28:41.556361 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 11:28:41 crc kubenswrapper[4725]: I0225 11:28:41.557186 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"add76c268fa48b85bd8b4a73353a88415ac719328ee98d349951379413d37c8f"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 11:28:41 crc kubenswrapper[4725]: I0225 11:28:41.557273 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://add76c268fa48b85bd8b4a73353a88415ac719328ee98d349951379413d37c8f" gracePeriod=600 Feb 25 11:28:42 crc kubenswrapper[4725]: I0225 11:28:42.276794 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="add76c268fa48b85bd8b4a73353a88415ac719328ee98d349951379413d37c8f" exitCode=0 Feb 25 11:28:42 crc kubenswrapper[4725]: I0225 11:28:42.276864 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"add76c268fa48b85bd8b4a73353a88415ac719328ee98d349951379413d37c8f"} Feb 25 11:28:42 crc 
kubenswrapper[4725]: I0225 11:28:42.277252 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"} Feb 25 11:28:42 crc kubenswrapper[4725]: I0225 11:28:42.277268 4725 scope.go:117] "RemoveContainer" containerID="bae0f893d8f6a848873f5da8d4118058de962aacba8f71babe2cfbc7f963fae5" Feb 25 11:29:06 crc kubenswrapper[4725]: I0225 11:29:06.510958 4725 generic.go:334] "Generic (PLEG): container finished" podID="9479ee63-ae8c-4dfb-87f0-d92785a85f3b" containerID="82322b0022ad0b2f63c56bbc22c4f54682cbff523b2df198623b010b9df77607" exitCode=0 Feb 25 11:29:06 crc kubenswrapper[4725]: I0225 11:29:06.511059 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" event={"ID":"9479ee63-ae8c-4dfb-87f0-d92785a85f3b","Type":"ContainerDied","Data":"82322b0022ad0b2f63c56bbc22c4f54682cbff523b2df198623b010b9df77607"} Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.909230 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.931174 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-metadata-combined-ca-bundle\") pod \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.931414 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-ssh-key-openstack-edpm-ipam\") pod \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.931492 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-inventory\") pod \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.931582 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.931684 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2zcq\" (UniqueName: \"kubernetes.io/projected/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-kube-api-access-z2zcq\") pod \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " Feb 25 
11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.931746 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-nova-metadata-neutron-config-0\") pod \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\" (UID: \"9479ee63-ae8c-4dfb-87f0-d92785a85f3b\") " Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.938767 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-kube-api-access-z2zcq" (OuterVolumeSpecName: "kube-api-access-z2zcq") pod "9479ee63-ae8c-4dfb-87f0-d92785a85f3b" (UID: "9479ee63-ae8c-4dfb-87f0-d92785a85f3b"). InnerVolumeSpecName "kube-api-access-z2zcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.939320 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9479ee63-ae8c-4dfb-87f0-d92785a85f3b" (UID: "9479ee63-ae8c-4dfb-87f0-d92785a85f3b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.967333 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-inventory" (OuterVolumeSpecName: "inventory") pod "9479ee63-ae8c-4dfb-87f0-d92785a85f3b" (UID: "9479ee63-ae8c-4dfb-87f0-d92785a85f3b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.967452 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9479ee63-ae8c-4dfb-87f0-d92785a85f3b" (UID: "9479ee63-ae8c-4dfb-87f0-d92785a85f3b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.968217 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9479ee63-ae8c-4dfb-87f0-d92785a85f3b" (UID: "9479ee63-ae8c-4dfb-87f0-d92785a85f3b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:29:07 crc kubenswrapper[4725]: I0225 11:29:07.969291 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9479ee63-ae8c-4dfb-87f0-d92785a85f3b" (UID: "9479ee63-ae8c-4dfb-87f0-d92785a85f3b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.033849 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.033885 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.033900 4725 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.033912 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2zcq\" (UniqueName: \"kubernetes.io/projected/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-kube-api-access-z2zcq\") on node \"crc\" DevicePath \"\"" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.033929 4725 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.033940 4725 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9479ee63-ae8c-4dfb-87f0-d92785a85f3b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.532636 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" event={"ID":"9479ee63-ae8c-4dfb-87f0-d92785a85f3b","Type":"ContainerDied","Data":"0910be5a7ffe0753bd371d50fd75f0c82d2d11c03df26e7ad8dfac16cedc17b1"} Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.532696 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0910be5a7ffe0753bd371d50fd75f0c82d2d11c03df26e7ad8dfac16cedc17b1" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.532719 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.621049 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n"] Feb 25 11:29:08 crc kubenswrapper[4725]: E0225 11:29:08.621390 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="extract-content" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.621407 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="extract-content" Feb 25 11:29:08 crc kubenswrapper[4725]: E0225 11:29:08.621424 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="registry-server" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.621432 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="registry-server" Feb 25 11:29:08 crc kubenswrapper[4725]: E0225 11:29:08.621446 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9479ee63-ae8c-4dfb-87f0-d92785a85f3b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.621453 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9479ee63-ae8c-4dfb-87f0-d92785a85f3b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 11:29:08 crc kubenswrapper[4725]: E0225 11:29:08.621463 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="extract-utilities" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.621469 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="extract-utilities" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.621652 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2c47d0-c5f6-4e0c-9436-0e232db99667" containerName="registry-server" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.621663 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9479ee63-ae8c-4dfb-87f0-d92785a85f3b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.622335 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.624621 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.624805 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.624870 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.624976 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.625003 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.644224 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.644379 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2mzg\" (UniqueName: \"kubernetes.io/projected/6c225171-2b3a-414b-94d4-d73cc4d28b97-kube-api-access-r2mzg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.644482 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.644585 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.644809 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.652132 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n"] Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.747193 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.747259 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.747305 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.747404 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2mzg\" (UniqueName: \"kubernetes.io/projected/6c225171-2b3a-414b-94d4-d73cc4d28b97-kube-api-access-r2mzg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.747430 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.751426 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: 
\"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.751441 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.751441 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.752348 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.769133 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2mzg\" (UniqueName: \"kubernetes.io/projected/6c225171-2b3a-414b-94d4-d73cc4d28b97-kube-api-access-r2mzg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:08 crc kubenswrapper[4725]: I0225 11:29:08.949170 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" Feb 25 11:29:09 crc kubenswrapper[4725]: I0225 11:29:09.540966 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n"] Feb 25 11:29:10 crc kubenswrapper[4725]: I0225 11:29:10.554201 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" event={"ID":"6c225171-2b3a-414b-94d4-d73cc4d28b97","Type":"ContainerStarted","Data":"e5df1b56b38b80110cd37aafa6a07afafb20b221d406b90f3ca3c2dd7e39a492"} Feb 25 11:29:10 crc kubenswrapper[4725]: I0225 11:29:10.554266 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" event={"ID":"6c225171-2b3a-414b-94d4-d73cc4d28b97","Type":"ContainerStarted","Data":"aa22a4b3519a6684a9aa26dbb3b1a1fe42def6b01430d35747515741b51acc10"} Feb 25 11:29:10 crc kubenswrapper[4725]: I0225 11:29:10.570145 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" podStartSLOduration=1.918297919 podStartE2EDuration="2.57012382s" podCreationTimestamp="2026-02-25 11:29:08 +0000 UTC" firstStartedPulling="2026-02-25 11:29:09.551504687 +0000 UTC m=+2175.050086722" lastFinishedPulling="2026-02-25 11:29:10.203330588 +0000 UTC m=+2175.701912623" observedRunningTime="2026-02-25 11:29:10.568261 +0000 UTC m=+2176.066843015" watchObservedRunningTime="2026-02-25 11:29:10.57012382 +0000 UTC m=+2176.068705845" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.168200 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533650-xfh7n"] Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.170323 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533650-xfh7n" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.172576 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.173158 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.173210 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.184389 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533650-xfh7n"] Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.257774 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt"] Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.259095 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.261412 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.264161 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.266844 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt"] Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.297705 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkm4\" (UniqueName: \"kubernetes.io/projected/2c240bb9-703d-46d4-81b8-6f733dac4d9d-kube-api-access-6dkm4\") pod \"auto-csr-approver-29533650-xfh7n\" (UID: \"2c240bb9-703d-46d4-81b8-6f733dac4d9d\") " pod="openshift-infra/auto-csr-approver-29533650-xfh7n" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.400145 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/218d899e-34d3-466f-93f7-6fea492a5105-secret-volume\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.400222 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h69rq\" (UniqueName: \"kubernetes.io/projected/218d899e-34d3-466f-93f7-6fea492a5105-kube-api-access-h69rq\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.400415 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkm4\" (UniqueName: \"kubernetes.io/projected/2c240bb9-703d-46d4-81b8-6f733dac4d9d-kube-api-access-6dkm4\") pod \"auto-csr-approver-29533650-xfh7n\" (UID: \"2c240bb9-703d-46d4-81b8-6f733dac4d9d\") " pod="openshift-infra/auto-csr-approver-29533650-xfh7n" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.400624 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218d899e-34d3-466f-93f7-6fea492a5105-config-volume\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.425144 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkm4\" (UniqueName: \"kubernetes.io/projected/2c240bb9-703d-46d4-81b8-6f733dac4d9d-kube-api-access-6dkm4\") pod \"auto-csr-approver-29533650-xfh7n\" (UID: \"2c240bb9-703d-46d4-81b8-6f733dac4d9d\") " pod="openshift-infra/auto-csr-approver-29533650-xfh7n" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.502444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218d899e-34d3-466f-93f7-6fea492a5105-config-volume\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.502600 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/218d899e-34d3-466f-93f7-6fea492a5105-secret-volume\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.502678 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h69rq\" (UniqueName: \"kubernetes.io/projected/218d899e-34d3-466f-93f7-6fea492a5105-kube-api-access-h69rq\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.503599 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218d899e-34d3-466f-93f7-6fea492a5105-config-volume\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.506258 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533650-xfh7n" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.507595 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/218d899e-34d3-466f-93f7-6fea492a5105-secret-volume\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.524784 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h69rq\" (UniqueName: \"kubernetes.io/projected/218d899e-34d3-466f-93f7-6fea492a5105-kube-api-access-h69rq\") pod \"collect-profiles-29533650-xsjwt\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:00 crc kubenswrapper[4725]: I0225 11:30:00.573482 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:01 crc kubenswrapper[4725]: I0225 11:30:01.029554 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533650-xfh7n"] Feb 25 11:30:01 crc kubenswrapper[4725]: W0225 11:30:01.034021 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c240bb9_703d_46d4_81b8_6f733dac4d9d.slice/crio-2e57b18d64bce2f3d2ee6ceca3bafdd8b4f1fa6d80d76ca81c31ffbaa7c70536 WatchSource:0}: Error finding container 2e57b18d64bce2f3d2ee6ceca3bafdd8b4f1fa6d80d76ca81c31ffbaa7c70536: Status 404 returned error can't find the container with id 2e57b18d64bce2f3d2ee6ceca3bafdd8b4f1fa6d80d76ca81c31ffbaa7c70536 Feb 25 11:30:01 crc kubenswrapper[4725]: I0225 11:30:01.090381 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt"] Feb 25 11:30:02 crc kubenswrapper[4725]: I0225 11:30:02.054547 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533650-xfh7n" event={"ID":"2c240bb9-703d-46d4-81b8-6f733dac4d9d","Type":"ContainerStarted","Data":"2e57b18d64bce2f3d2ee6ceca3bafdd8b4f1fa6d80d76ca81c31ffbaa7c70536"} Feb 25 11:30:02 crc kubenswrapper[4725]: I0225 11:30:02.056909 4725 generic.go:334] "Generic (PLEG): container finished" podID="218d899e-34d3-466f-93f7-6fea492a5105" containerID="8988fa751aea948980118659f3fdad7948605373f0b6ad6055646806f7139eaf" exitCode=0 Feb 25 11:30:02 crc kubenswrapper[4725]: I0225 11:30:02.056949 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" event={"ID":"218d899e-34d3-466f-93f7-6fea492a5105","Type":"ContainerDied","Data":"8988fa751aea948980118659f3fdad7948605373f0b6ad6055646806f7139eaf"} Feb 25 11:30:02 crc kubenswrapper[4725]: I0225 11:30:02.056975 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" event={"ID":"218d899e-34d3-466f-93f7-6fea492a5105","Type":"ContainerStarted","Data":"9c154fc9754a31404e71262d6dcd152dc98ae4c5adbdc61bc72ca6f5d9bbc343"} Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.492444 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.584817 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/218d899e-34d3-466f-93f7-6fea492a5105-secret-volume\") pod \"218d899e-34d3-466f-93f7-6fea492a5105\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.584924 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h69rq\" (UniqueName: \"kubernetes.io/projected/218d899e-34d3-466f-93f7-6fea492a5105-kube-api-access-h69rq\") pod \"218d899e-34d3-466f-93f7-6fea492a5105\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.584969 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218d899e-34d3-466f-93f7-6fea492a5105-config-volume\") pod \"218d899e-34d3-466f-93f7-6fea492a5105\" (UID: \"218d899e-34d3-466f-93f7-6fea492a5105\") " Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.586460 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/218d899e-34d3-466f-93f7-6fea492a5105-config-volume" (OuterVolumeSpecName: "config-volume") pod "218d899e-34d3-466f-93f7-6fea492a5105" (UID: "218d899e-34d3-466f-93f7-6fea492a5105"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.593479 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218d899e-34d3-466f-93f7-6fea492a5105-kube-api-access-h69rq" (OuterVolumeSpecName: "kube-api-access-h69rq") pod "218d899e-34d3-466f-93f7-6fea492a5105" (UID: "218d899e-34d3-466f-93f7-6fea492a5105"). InnerVolumeSpecName "kube-api-access-h69rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.603990 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218d899e-34d3-466f-93f7-6fea492a5105-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "218d899e-34d3-466f-93f7-6fea492a5105" (UID: "218d899e-34d3-466f-93f7-6fea492a5105"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.687978 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/218d899e-34d3-466f-93f7-6fea492a5105-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.688458 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h69rq\" (UniqueName: \"kubernetes.io/projected/218d899e-34d3-466f-93f7-6fea492a5105-kube-api-access-h69rq\") on node \"crc\" DevicePath \"\"" Feb 25 11:30:03 crc kubenswrapper[4725]: I0225 11:30:03.688470 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218d899e-34d3-466f-93f7-6fea492a5105-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.076294 4725 generic.go:334] "Generic (PLEG): container finished" podID="2c240bb9-703d-46d4-81b8-6f733dac4d9d" 
containerID="254dc42e26a8a333d26d1d3363566838ecbebf27979b9e32acb733e23b13c0f8" exitCode=0 Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.076376 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533650-xfh7n" event={"ID":"2c240bb9-703d-46d4-81b8-6f733dac4d9d","Type":"ContainerDied","Data":"254dc42e26a8a333d26d1d3363566838ecbebf27979b9e32acb733e23b13c0f8"} Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.079068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" event={"ID":"218d899e-34d3-466f-93f7-6fea492a5105","Type":"ContainerDied","Data":"9c154fc9754a31404e71262d6dcd152dc98ae4c5adbdc61bc72ca6f5d9bbc343"} Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.079113 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533650-xsjwt" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.079120 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c154fc9754a31404e71262d6dcd152dc98ae4c5adbdc61bc72ca6f5d9bbc343" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.587521 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"] Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.607411 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533605-22g2l"] Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.639927 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6lf6t"] Feb 25 11:30:04 crc kubenswrapper[4725]: E0225 11:30:04.640514 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218d899e-34d3-466f-93f7-6fea492a5105" containerName="collect-profiles" Feb 25 11:30:04 crc kubenswrapper[4725]: 
I0225 11:30:04.640602 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="218d899e-34d3-466f-93f7-6fea492a5105" containerName="collect-profiles" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.640946 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="218d899e-34d3-466f-93f7-6fea492a5105" containerName="collect-profiles" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.642586 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.652373 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lf6t"] Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.838145 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28l4\" (UniqueName: \"kubernetes.io/projected/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-kube-api-access-d28l4\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.838189 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-catalog-content\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.838237 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-utilities\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc 
kubenswrapper[4725]: I0225 11:30:04.939690 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-utilities\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.940129 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28l4\" (UniqueName: \"kubernetes.io/projected/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-kube-api-access-d28l4\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.940159 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-catalog-content\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.940352 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-utilities\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.941893 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-catalog-content\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:04 crc kubenswrapper[4725]: I0225 11:30:04.962200 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28l4\" (UniqueName: \"kubernetes.io/projected/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-kube-api-access-d28l4\") pod \"redhat-marketplace-6lf6t\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:05 crc kubenswrapper[4725]: I0225 11:30:05.235977 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fe5978-cb79-459f-b51a-b8f769ea177f" path="/var/lib/kubelet/pods/08fe5978-cb79-459f-b51a-b8f769ea177f/volumes" Feb 25 11:30:05 crc kubenswrapper[4725]: I0225 11:30:05.260556 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:05 crc kubenswrapper[4725]: I0225 11:30:05.462542 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533650-xfh7n" Feb 25 11:30:05 crc kubenswrapper[4725]: I0225 11:30:05.559992 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dkm4\" (UniqueName: \"kubernetes.io/projected/2c240bb9-703d-46d4-81b8-6f733dac4d9d-kube-api-access-6dkm4\") pod \"2c240bb9-703d-46d4-81b8-6f733dac4d9d\" (UID: \"2c240bb9-703d-46d4-81b8-6f733dac4d9d\") " Feb 25 11:30:05 crc kubenswrapper[4725]: I0225 11:30:05.563954 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c240bb9-703d-46d4-81b8-6f733dac4d9d-kube-api-access-6dkm4" (OuterVolumeSpecName: "kube-api-access-6dkm4") pod "2c240bb9-703d-46d4-81b8-6f733dac4d9d" (UID: "2c240bb9-703d-46d4-81b8-6f733dac4d9d"). InnerVolumeSpecName "kube-api-access-6dkm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:30:05 crc kubenswrapper[4725]: I0225 11:30:05.661839 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dkm4\" (UniqueName: \"kubernetes.io/projected/2c240bb9-703d-46d4-81b8-6f733dac4d9d-kube-api-access-6dkm4\") on node \"crc\" DevicePath \"\"" Feb 25 11:30:05 crc kubenswrapper[4725]: W0225 11:30:05.729851 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ad791d_ed98_4fc4_b305_0ba339bf37a1.slice/crio-7c01e6dc5d2651c00480e5a4bc04fd9eb9cb1e5bf88db035b731b27a12e3095b WatchSource:0}: Error finding container 7c01e6dc5d2651c00480e5a4bc04fd9eb9cb1e5bf88db035b731b27a12e3095b: Status 404 returned error can't find the container with id 7c01e6dc5d2651c00480e5a4bc04fd9eb9cb1e5bf88db035b731b27a12e3095b Feb 25 11:30:05 crc kubenswrapper[4725]: I0225 11:30:05.738271 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lf6t"] Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 11:30:06.100247 4725 generic.go:334] "Generic (PLEG): container finished" podID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerID="92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12" exitCode=0 Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 11:30:06.100318 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lf6t" event={"ID":"c3ad791d-ed98-4fc4-b305-0ba339bf37a1","Type":"ContainerDied","Data":"92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12"} Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 11:30:06.100548 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lf6t" event={"ID":"c3ad791d-ed98-4fc4-b305-0ba339bf37a1","Type":"ContainerStarted","Data":"7c01e6dc5d2651c00480e5a4bc04fd9eb9cb1e5bf88db035b731b27a12e3095b"} Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 
11:30:06.103770 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533650-xfh7n" event={"ID":"2c240bb9-703d-46d4-81b8-6f733dac4d9d","Type":"ContainerDied","Data":"2e57b18d64bce2f3d2ee6ceca3bafdd8b4f1fa6d80d76ca81c31ffbaa7c70536"} Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 11:30:06.103798 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e57b18d64bce2f3d2ee6ceca3bafdd8b4f1fa6d80d76ca81c31ffbaa7c70536" Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 11:30:06.103856 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533650-xfh7n" Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 11:30:06.519272 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533644-2jhmj"] Feb 25 11:30:06 crc kubenswrapper[4725]: I0225 11:30:06.526805 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533644-2jhmj"] Feb 25 11:30:07 crc kubenswrapper[4725]: I0225 11:30:07.115049 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lf6t" event={"ID":"c3ad791d-ed98-4fc4-b305-0ba339bf37a1","Type":"ContainerStarted","Data":"bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b"} Feb 25 11:30:07 crc kubenswrapper[4725]: I0225 11:30:07.242469 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07212026-7350-41cd-8012-c427bd678f3c" path="/var/lib/kubelet/pods/07212026-7350-41cd-8012-c427bd678f3c/volumes" Feb 25 11:30:08 crc kubenswrapper[4725]: I0225 11:30:08.125247 4725 generic.go:334] "Generic (PLEG): container finished" podID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerID="bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b" exitCode=0 Feb 25 11:30:08 crc kubenswrapper[4725]: I0225 11:30:08.125312 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6lf6t" event={"ID":"c3ad791d-ed98-4fc4-b305-0ba339bf37a1","Type":"ContainerDied","Data":"bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b"} Feb 25 11:30:09 crc kubenswrapper[4725]: I0225 11:30:09.148260 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lf6t" event={"ID":"c3ad791d-ed98-4fc4-b305-0ba339bf37a1","Type":"ContainerStarted","Data":"4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c"} Feb 25 11:30:09 crc kubenswrapper[4725]: I0225 11:30:09.175923 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6lf6t" podStartSLOduration=2.755675986 podStartE2EDuration="5.1759029s" podCreationTimestamp="2026-02-25 11:30:04 +0000 UTC" firstStartedPulling="2026-02-25 11:30:06.103090173 +0000 UTC m=+2231.601672238" lastFinishedPulling="2026-02-25 11:30:08.523317127 +0000 UTC m=+2234.021899152" observedRunningTime="2026-02-25 11:30:09.173691821 +0000 UTC m=+2234.672273846" watchObservedRunningTime="2026-02-25 11:30:09.1759029 +0000 UTC m=+2234.674484935" Feb 25 11:30:15 crc kubenswrapper[4725]: I0225 11:30:15.261578 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:15 crc kubenswrapper[4725]: I0225 11:30:15.262234 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:15 crc kubenswrapper[4725]: I0225 11:30:15.349533 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:16 crc kubenswrapper[4725]: I0225 11:30:16.285521 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:16 crc kubenswrapper[4725]: I0225 11:30:16.354218 4725 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lf6t"] Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.244128 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6lf6t" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="registry-server" containerID="cri-o://4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c" gracePeriod=2 Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.773408 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.852399 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-utilities\") pod \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.852556 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-catalog-content\") pod \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.852687 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d28l4\" (UniqueName: \"kubernetes.io/projected/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-kube-api-access-d28l4\") pod \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\" (UID: \"c3ad791d-ed98-4fc4-b305-0ba339bf37a1\") " Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.853626 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-utilities" (OuterVolumeSpecName: "utilities") pod 
"c3ad791d-ed98-4fc4-b305-0ba339bf37a1" (UID: "c3ad791d-ed98-4fc4-b305-0ba339bf37a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.857991 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-kube-api-access-d28l4" (OuterVolumeSpecName: "kube-api-access-d28l4") pod "c3ad791d-ed98-4fc4-b305-0ba339bf37a1" (UID: "c3ad791d-ed98-4fc4-b305-0ba339bf37a1"). InnerVolumeSpecName "kube-api-access-d28l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.874449 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3ad791d-ed98-4fc4-b305-0ba339bf37a1" (UID: "c3ad791d-ed98-4fc4-b305-0ba339bf37a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.954992 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.955025 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:30:18 crc kubenswrapper[4725]: I0225 11:30:18.955040 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d28l4\" (UniqueName: \"kubernetes.io/projected/c3ad791d-ed98-4fc4-b305-0ba339bf37a1-kube-api-access-d28l4\") on node \"crc\" DevicePath \"\"" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.255179 4725 generic.go:334] "Generic (PLEG): container finished" podID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerID="4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c" exitCode=0 Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.255229 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lf6t" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.255237 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lf6t" event={"ID":"c3ad791d-ed98-4fc4-b305-0ba339bf37a1","Type":"ContainerDied","Data":"4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c"} Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.255383 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lf6t" event={"ID":"c3ad791d-ed98-4fc4-b305-0ba339bf37a1","Type":"ContainerDied","Data":"7c01e6dc5d2651c00480e5a4bc04fd9eb9cb1e5bf88db035b731b27a12e3095b"} Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.255405 4725 scope.go:117] "RemoveContainer" containerID="4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.300508 4725 scope.go:117] "RemoveContainer" containerID="bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.302165 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lf6t"] Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.309523 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lf6t"] Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.343152 4725 scope.go:117] "RemoveContainer" containerID="92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.415934 4725 scope.go:117] "RemoveContainer" containerID="4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c" Feb 25 11:30:19 crc kubenswrapper[4725]: E0225 11:30:19.417610 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c\": container with ID starting with 4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c not found: ID does not exist" containerID="4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.417659 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c"} err="failed to get container status \"4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c\": rpc error: code = NotFound desc = could not find container \"4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c\": container with ID starting with 4e8b87a6be32c842297483a4f9e0d0f004f99436c13fdc634b38e5904660892c not found: ID does not exist" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.417695 4725 scope.go:117] "RemoveContainer" containerID="bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b" Feb 25 11:30:19 crc kubenswrapper[4725]: E0225 11:30:19.418373 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b\": container with ID starting with bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b not found: ID does not exist" containerID="bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.418430 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b"} err="failed to get container status \"bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b\": rpc error: code = NotFound desc = could not find container \"bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b\": container with ID 
starting with bfe578647071f090a1e7e1b20eb4b343291d542b89e1d721dc831be794faa72b not found: ID does not exist" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.418467 4725 scope.go:117] "RemoveContainer" containerID="92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12" Feb 25 11:30:19 crc kubenswrapper[4725]: E0225 11:30:19.418894 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12\": container with ID starting with 92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12 not found: ID does not exist" containerID="92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12" Feb 25 11:30:19 crc kubenswrapper[4725]: I0225 11:30:19.418937 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12"} err="failed to get container status \"92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12\": rpc error: code = NotFound desc = could not find container \"92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12\": container with ID starting with 92c64cbd100bb630a5489fd4ceceb1601d07901878ca9df4e60b9c2781ae3b12 not found: ID does not exist" Feb 25 11:30:20 crc kubenswrapper[4725]: I0225 11:30:20.298623 4725 scope.go:117] "RemoveContainer" containerID="32fe34383390b195eab6a2d1793b396e86bbfbb9b05b4b9c711cdc833de263d7" Feb 25 11:30:20 crc kubenswrapper[4725]: I0225 11:30:20.364729 4725 scope.go:117] "RemoveContainer" containerID="9efa1097b38368bb85aa4b081c9f8cb61478e622441ab1602bfa0088065f26de" Feb 25 11:30:21 crc kubenswrapper[4725]: I0225 11:30:21.236960 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" path="/var/lib/kubelet/pods/c3ad791d-ed98-4fc4-b305-0ba339bf37a1/volumes" Feb 25 11:30:41 crc kubenswrapper[4725]: 
I0225 11:30:41.556086 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:30:41 crc kubenswrapper[4725]: I0225 11:30:41.556731 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:31:11 crc kubenswrapper[4725]: I0225 11:31:11.556291 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:31:11 crc kubenswrapper[4725]: I0225 11:31:11.557004 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:31:41 crc kubenswrapper[4725]: I0225 11:31:41.555371 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:31:41 crc kubenswrapper[4725]: I0225 11:31:41.556051 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:31:41 crc kubenswrapper[4725]: I0225 11:31:41.556117 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 11:31:41 crc kubenswrapper[4725]: I0225 11:31:41.557259 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 11:31:41 crc kubenswrapper[4725]: I0225 11:31:41.557367 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" gracePeriod=600
Feb 25 11:31:41 crc kubenswrapper[4725]: E0225 11:31:41.679313 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:31:42 crc kubenswrapper[4725]: I0225 11:31:42.164067 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" exitCode=0
Feb 25 11:31:42 crc kubenswrapper[4725]: I0225 11:31:42.164123 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"}
Feb 25 11:31:42 crc kubenswrapper[4725]: I0225 11:31:42.164165 4725 scope.go:117] "RemoveContainer" containerID="add76c268fa48b85bd8b4a73353a88415ac719328ee98d349951379413d37c8f"
Feb 25 11:31:42 crc kubenswrapper[4725]: I0225 11:31:42.165056 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"
Feb 25 11:31:42 crc kubenswrapper[4725]: E0225 11:31:42.165290 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:31:43 crc kubenswrapper[4725]: I0225 11:31:43.161570 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-58868cbfd5-pvwdv" podUID="4971206d-e6f2-4355-8c47-9a7c9e1e51d6" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Feb 25 11:31:54 crc kubenswrapper[4725]: I0225 11:31:54.224928 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"
Feb 25 11:31:54 crc kubenswrapper[4725]: E0225 11:31:54.226114 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.169978 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533652-fpx7l"]
Feb 25 11:32:00 crc kubenswrapper[4725]: E0225 11:32:00.171631 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c240bb9-703d-46d4-81b8-6f733dac4d9d" containerName="oc"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.171669 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c240bb9-703d-46d4-81b8-6f733dac4d9d" containerName="oc"
Feb 25 11:32:00 crc kubenswrapper[4725]: E0225 11:32:00.171718 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="extract-utilities"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.171736 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="extract-utilities"
Feb 25 11:32:00 crc kubenswrapper[4725]: E0225 11:32:00.171782 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="registry-server"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.171800 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="registry-server"
Feb 25 11:32:00 crc kubenswrapper[4725]: E0225 11:32:00.171824 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="extract-content"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.171883 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="extract-content"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.172373 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c240bb9-703d-46d4-81b8-6f733dac4d9d" containerName="oc"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.172414 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ad791d-ed98-4fc4-b305-0ba339bf37a1" containerName="registry-server"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.173778 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533652-fpx7l"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.177706 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.178037 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.178235 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.184396 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533652-fpx7l"]
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.272004 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqdc8\" (UniqueName: \"kubernetes.io/projected/be6b69e2-5209-4fc0-8622-2e1b9dae914f-kube-api-access-mqdc8\") pod \"auto-csr-approver-29533652-fpx7l\" (UID: \"be6b69e2-5209-4fc0-8622-2e1b9dae914f\") " pod="openshift-infra/auto-csr-approver-29533652-fpx7l"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.375881 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqdc8\" (UniqueName: \"kubernetes.io/projected/be6b69e2-5209-4fc0-8622-2e1b9dae914f-kube-api-access-mqdc8\") pod \"auto-csr-approver-29533652-fpx7l\" (UID: \"be6b69e2-5209-4fc0-8622-2e1b9dae914f\") " pod="openshift-infra/auto-csr-approver-29533652-fpx7l"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.412760 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqdc8\" (UniqueName: \"kubernetes.io/projected/be6b69e2-5209-4fc0-8622-2e1b9dae914f-kube-api-access-mqdc8\") pod \"auto-csr-approver-29533652-fpx7l\" (UID: \"be6b69e2-5209-4fc0-8622-2e1b9dae914f\") " pod="openshift-infra/auto-csr-approver-29533652-fpx7l"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.506176 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533652-fpx7l"
Feb 25 11:32:00 crc kubenswrapper[4725]: I0225 11:32:00.815760 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533652-fpx7l"]
Feb 25 11:32:01 crc kubenswrapper[4725]: I0225 11:32:01.377453 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533652-fpx7l" event={"ID":"be6b69e2-5209-4fc0-8622-2e1b9dae914f","Type":"ContainerStarted","Data":"416f711f55c5acfecab0493062d09d8e3ff0ee51ae80d48a9fcb36f48bae640e"}
Feb 25 11:32:02 crc kubenswrapper[4725]: I0225 11:32:02.391627 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533652-fpx7l" event={"ID":"be6b69e2-5209-4fc0-8622-2e1b9dae914f","Type":"ContainerStarted","Data":"167494c8facf58d7dada5918c139f23d49916f338211a74adb8e69380a93d07c"}
Feb 25 11:32:02 crc kubenswrapper[4725]: I0225 11:32:02.416011 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533652-fpx7l" podStartSLOduration=1.362984536 podStartE2EDuration="2.415983892s" podCreationTimestamp="2026-02-25 11:32:00 +0000 UTC" firstStartedPulling="2026-02-25 11:32:00.817908855 +0000 UTC m=+2346.316490880" lastFinishedPulling="2026-02-25 11:32:01.870908201 +0000 UTC m=+2347.369490236" observedRunningTime="2026-02-25 11:32:02.405322146 +0000 UTC m=+2347.903904221" watchObservedRunningTime="2026-02-25 11:32:02.415983892 +0000 UTC m=+2347.914565937"
Feb 25 11:32:03 crc kubenswrapper[4725]: I0225 11:32:03.404501 4725 generic.go:334] "Generic (PLEG): container finished" podID="be6b69e2-5209-4fc0-8622-2e1b9dae914f" containerID="167494c8facf58d7dada5918c139f23d49916f338211a74adb8e69380a93d07c" exitCode=0
Feb 25 11:32:03 crc kubenswrapper[4725]: I0225 11:32:03.404543 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533652-fpx7l" event={"ID":"be6b69e2-5209-4fc0-8622-2e1b9dae914f","Type":"ContainerDied","Data":"167494c8facf58d7dada5918c139f23d49916f338211a74adb8e69380a93d07c"}
Feb 25 11:32:04 crc kubenswrapper[4725]: I0225 11:32:04.901248 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533652-fpx7l"
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.095311 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqdc8\" (UniqueName: \"kubernetes.io/projected/be6b69e2-5209-4fc0-8622-2e1b9dae914f-kube-api-access-mqdc8\") pod \"be6b69e2-5209-4fc0-8622-2e1b9dae914f\" (UID: \"be6b69e2-5209-4fc0-8622-2e1b9dae914f\") "
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.102079 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6b69e2-5209-4fc0-8622-2e1b9dae914f-kube-api-access-mqdc8" (OuterVolumeSpecName: "kube-api-access-mqdc8") pod "be6b69e2-5209-4fc0-8622-2e1b9dae914f" (UID: "be6b69e2-5209-4fc0-8622-2e1b9dae914f"). InnerVolumeSpecName "kube-api-access-mqdc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.197421 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqdc8\" (UniqueName: \"kubernetes.io/projected/be6b69e2-5209-4fc0-8622-2e1b9dae914f-kube-api-access-mqdc8\") on node \"crc\" DevicePath \"\""
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.428396 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533652-fpx7l" event={"ID":"be6b69e2-5209-4fc0-8622-2e1b9dae914f","Type":"ContainerDied","Data":"416f711f55c5acfecab0493062d09d8e3ff0ee51ae80d48a9fcb36f48bae640e"}
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.428437 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="416f711f55c5acfecab0493062d09d8e3ff0ee51ae80d48a9fcb36f48bae640e"
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.428470 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533652-fpx7l"
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.500975 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533646-rtfb8"]
Feb 25 11:32:05 crc kubenswrapper[4725]: I0225 11:32:05.511037 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533646-rtfb8"]
Feb 25 11:32:07 crc kubenswrapper[4725]: I0225 11:32:07.237414 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd0fea4-28ef-4a6e-8b5b-7d137842723d" path="/var/lib/kubelet/pods/4cd0fea4-28ef-4a6e-8b5b-7d137842723d/volumes"
Feb 25 11:32:08 crc kubenswrapper[4725]: I0225 11:32:08.224274 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"
Feb 25 11:32:08 crc kubenswrapper[4725]: E0225 11:32:08.224994 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:32:20 crc kubenswrapper[4725]: I0225 11:32:20.224932 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"
Feb 25 11:32:20 crc kubenswrapper[4725]: E0225 11:32:20.226268 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:32:20 crc kubenswrapper[4725]: I0225 11:32:20.506058 4725 scope.go:117] "RemoveContainer" containerID="88dae455f946b82bc12405b15fe9b7803b8d1088a3c30cb18c6e6f9f926dfa11"
Feb 25 11:32:34 crc kubenswrapper[4725]: I0225 11:32:34.224915 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"
Feb 25 11:32:34 crc kubenswrapper[4725]: E0225 11:32:34.225969 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:32:49 crc kubenswrapper[4725]: I0225 11:32:49.224817 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"
Feb 25 11:32:49 crc kubenswrapper[4725]: E0225 11:32:49.225633 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:33:00 crc kubenswrapper[4725]: I0225 11:33:00.225046 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378"
Feb 25 11:33:00 crc kubenswrapper[4725]: E0225 11:33:00.226317 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:33:09 crc kubenswrapper[4725]: I0225 11:33:09.155822 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c225171-2b3a-414b-94d4-d73cc4d28b97" containerID="e5df1b56b38b80110cd37aafa6a07afafb20b221d406b90f3ca3c2dd7e39a492" exitCode=0
Feb 25 11:33:09 crc kubenswrapper[4725]: I0225 11:33:09.155891 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" event={"ID":"6c225171-2b3a-414b-94d4-d73cc4d28b97","Type":"ContainerDied","Data":"e5df1b56b38b80110cd37aafa6a07afafb20b221d406b90f3ca3c2dd7e39a492"}
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.732010 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n"
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.849078 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2mzg\" (UniqueName: \"kubernetes.io/projected/6c225171-2b3a-414b-94d4-d73cc4d28b97-kube-api-access-r2mzg\") pod \"6c225171-2b3a-414b-94d4-d73cc4d28b97\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") "
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.849201 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-ssh-key-openstack-edpm-ipam\") pod \"6c225171-2b3a-414b-94d4-d73cc4d28b97\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") "
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.849281 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-inventory\") pod \"6c225171-2b3a-414b-94d4-d73cc4d28b97\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") "
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.849472 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-combined-ca-bundle\") pod \"6c225171-2b3a-414b-94d4-d73cc4d28b97\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") "
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.849549 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-secret-0\") pod \"6c225171-2b3a-414b-94d4-d73cc4d28b97\" (UID: \"6c225171-2b3a-414b-94d4-d73cc4d28b97\") "
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.855018 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6c225171-2b3a-414b-94d4-d73cc4d28b97" (UID: "6c225171-2b3a-414b-94d4-d73cc4d28b97"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.855210 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c225171-2b3a-414b-94d4-d73cc4d28b97-kube-api-access-r2mzg" (OuterVolumeSpecName: "kube-api-access-r2mzg") pod "6c225171-2b3a-414b-94d4-d73cc4d28b97" (UID: "6c225171-2b3a-414b-94d4-d73cc4d28b97"). InnerVolumeSpecName "kube-api-access-r2mzg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.876534 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c225171-2b3a-414b-94d4-d73cc4d28b97" (UID: "6c225171-2b3a-414b-94d4-d73cc4d28b97"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.884174 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-inventory" (OuterVolumeSpecName: "inventory") pod "6c225171-2b3a-414b-94d4-d73cc4d28b97" (UID: "6c225171-2b3a-414b-94d4-d73cc4d28b97"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.891493 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6c225171-2b3a-414b-94d4-d73cc4d28b97" (UID: "6c225171-2b3a-414b-94d4-d73cc4d28b97"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.951950 4725 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.951980 4725 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.951995 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2mzg\" (UniqueName: \"kubernetes.io/projected/6c225171-2b3a-414b-94d4-d73cc4d28b97-kube-api-access-r2mzg\") on node \"crc\" DevicePath \"\""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.952008 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 25 11:33:10 crc kubenswrapper[4725]: I0225 11:33:10.952020 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c225171-2b3a-414b-94d4-d73cc4d28b97-inventory\") on node \"crc\" DevicePath \"\""
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.180885 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n" event={"ID":"6c225171-2b3a-414b-94d4-d73cc4d28b97","Type":"ContainerDied","Data":"aa22a4b3519a6684a9aa26dbb3b1a1fe42def6b01430d35747515741b51acc10"}
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.180960 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa22a4b3519a6684a9aa26dbb3b1a1fe42def6b01430d35747515741b51acc10"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.181310 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.367643 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"]
Feb 25 11:33:11 crc kubenswrapper[4725]: E0225 11:33:11.368154 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c225171-2b3a-414b-94d4-d73cc4d28b97" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.368181 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c225171-2b3a-414b-94d4-d73cc4d28b97" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:33:11 crc kubenswrapper[4725]: E0225 11:33:11.368221 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6b69e2-5209-4fc0-8622-2e1b9dae914f" containerName="oc"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.368229 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="be6b69e2-5209-4fc0-8622-2e1b9dae914f" containerName="oc"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.368469 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6b69e2-5209-4fc0-8622-2e1b9dae914f" containerName="oc"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.368491 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c225171-2b3a-414b-94d4-d73cc4d28b97" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.369280 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.376469 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.376675 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.376820 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.376948 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.377060 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.377280 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.377483 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.390809 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"]
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.464856 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.464912 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.464959 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.464988 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.465017 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.465052 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.465077 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.465111 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.465139 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrbn\" (UniqueName: \"kubernetes.io/projected/4c1ac37f-ee50-4446-8433-5c3f1c427205-kube-api-access-nzrbn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.465163 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.465206 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567320 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567397 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567463 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567503 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrbn\" (UniqueName: \"kubernetes.io/projected/4c1ac37f-ee50-4446-8433-5c3f1c427205-kube-api-access-nzrbn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567533 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567641 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567920 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.567966 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.568099 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.568156 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.568193 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.569233 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.572688 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.574085 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.574662 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.574807 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.575171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.575206 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.576277 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.579353 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"
Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.579487 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName:
\"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.593667 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrbn\" (UniqueName: \"kubernetes.io/projected/4c1ac37f-ee50-4446-8433-5c3f1c427205-kube-api-access-nzrbn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n8lt7\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" Feb 25 11:33:11 crc kubenswrapper[4725]: I0225 11:33:11.694998 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" Feb 25 11:33:12 crc kubenswrapper[4725]: I0225 11:33:12.226547 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:33:12 crc kubenswrapper[4725]: E0225 11:33:12.227997 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:33:12 crc kubenswrapper[4725]: I0225 11:33:12.257033 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7"] Feb 25 11:33:12 crc kubenswrapper[4725]: I0225 11:33:12.314195 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:33:13 crc kubenswrapper[4725]: I0225 11:33:13.208082 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" event={"ID":"4c1ac37f-ee50-4446-8433-5c3f1c427205","Type":"ContainerStarted","Data":"64a869573e317d712e88f9257f812daa600d151120427d13aeb3cb36336934ab"} Feb 25 11:33:13 crc kubenswrapper[4725]: I0225 11:33:13.208487 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" event={"ID":"4c1ac37f-ee50-4446-8433-5c3f1c427205","Type":"ContainerStarted","Data":"669004a0185ee3bb9b60ced37df9c68c7c7d90d2309c8b4ad4d396ac88d2cdc1"} Feb 25 11:33:13 crc kubenswrapper[4725]: I0225 11:33:13.267367 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" podStartSLOduration=1.799673285 podStartE2EDuration="2.267340295s" podCreationTimestamp="2026-02-25 11:33:11 +0000 UTC" firstStartedPulling="2026-02-25 11:33:12.313800849 +0000 UTC m=+2417.812382884" lastFinishedPulling="2026-02-25 11:33:12.781467839 +0000 UTC m=+2418.280049894" observedRunningTime="2026-02-25 11:33:13.267061887 +0000 UTC m=+2418.765643932" watchObservedRunningTime="2026-02-25 11:33:13.267340295 +0000 UTC m=+2418.765922360" Feb 25 11:33:27 crc kubenswrapper[4725]: I0225 11:33:27.226158 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:33:27 crc kubenswrapper[4725]: E0225 11:33:27.226925 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:33:42 crc kubenswrapper[4725]: I0225 11:33:42.225464 4725 scope.go:117] "RemoveContainer" 
containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:33:42 crc kubenswrapper[4725]: E0225 11:33:42.226616 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:33:55 crc kubenswrapper[4725]: I0225 11:33:55.235587 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:33:55 crc kubenswrapper[4725]: E0225 11:33:55.236449 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.176264 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533654-d89m7"] Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.179727 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533654-d89m7" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.186048 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.186072 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.189881 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74pc\" (UniqueName: \"kubernetes.io/projected/efa77d59-616e-430f-b790-a0fcbfc53b73-kube-api-access-z74pc\") pod \"auto-csr-approver-29533654-d89m7\" (UID: \"efa77d59-616e-430f-b790-a0fcbfc53b73\") " pod="openshift-infra/auto-csr-approver-29533654-d89m7" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.190496 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.194436 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533654-d89m7"] Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.292304 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74pc\" (UniqueName: \"kubernetes.io/projected/efa77d59-616e-430f-b790-a0fcbfc53b73-kube-api-access-z74pc\") pod \"auto-csr-approver-29533654-d89m7\" (UID: \"efa77d59-616e-430f-b790-a0fcbfc53b73\") " pod="openshift-infra/auto-csr-approver-29533654-d89m7" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.315376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74pc\" (UniqueName: \"kubernetes.io/projected/efa77d59-616e-430f-b790-a0fcbfc53b73-kube-api-access-z74pc\") pod \"auto-csr-approver-29533654-d89m7\" (UID: \"efa77d59-616e-430f-b790-a0fcbfc53b73\") " 
pod="openshift-infra/auto-csr-approver-29533654-d89m7" Feb 25 11:34:00 crc kubenswrapper[4725]: I0225 11:34:00.516754 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533654-d89m7" Feb 25 11:34:01 crc kubenswrapper[4725]: I0225 11:34:01.049899 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533654-d89m7"] Feb 25 11:34:01 crc kubenswrapper[4725]: I0225 11:34:01.744265 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533654-d89m7" event={"ID":"efa77d59-616e-430f-b790-a0fcbfc53b73","Type":"ContainerStarted","Data":"2b03a993374678132ee3f09563ae53deb6d1deab364289e74a241fb70d64fe24"} Feb 25 11:34:02 crc kubenswrapper[4725]: I0225 11:34:02.752526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533654-d89m7" event={"ID":"efa77d59-616e-430f-b790-a0fcbfc53b73","Type":"ContainerStarted","Data":"9120a675830f20e7616c6e03e5ad6ccc09d11df2709df7aa36ab8b18ecb7ff55"} Feb 25 11:34:02 crc kubenswrapper[4725]: I0225 11:34:02.786245 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533654-d89m7" podStartSLOduration=1.660965249 podStartE2EDuration="2.786218387s" podCreationTimestamp="2026-02-25 11:34:00 +0000 UTC" firstStartedPulling="2026-02-25 11:34:01.044769146 +0000 UTC m=+2466.543351201" lastFinishedPulling="2026-02-25 11:34:02.170022274 +0000 UTC m=+2467.668604339" observedRunningTime="2026-02-25 11:34:02.773815416 +0000 UTC m=+2468.272397431" watchObservedRunningTime="2026-02-25 11:34:02.786218387 +0000 UTC m=+2468.284800452" Feb 25 11:34:03 crc kubenswrapper[4725]: I0225 11:34:03.769190 4725 generic.go:334] "Generic (PLEG): container finished" podID="efa77d59-616e-430f-b790-a0fcbfc53b73" containerID="9120a675830f20e7616c6e03e5ad6ccc09d11df2709df7aa36ab8b18ecb7ff55" exitCode=0 Feb 25 11:34:03 crc 
kubenswrapper[4725]: I0225 11:34:03.769295 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533654-d89m7" event={"ID":"efa77d59-616e-430f-b790-a0fcbfc53b73","Type":"ContainerDied","Data":"9120a675830f20e7616c6e03e5ad6ccc09d11df2709df7aa36ab8b18ecb7ff55"} Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.291000 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533654-d89m7" Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.427315 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z74pc\" (UniqueName: \"kubernetes.io/projected/efa77d59-616e-430f-b790-a0fcbfc53b73-kube-api-access-z74pc\") pod \"efa77d59-616e-430f-b790-a0fcbfc53b73\" (UID: \"efa77d59-616e-430f-b790-a0fcbfc53b73\") " Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.433069 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa77d59-616e-430f-b790-a0fcbfc53b73-kube-api-access-z74pc" (OuterVolumeSpecName: "kube-api-access-z74pc") pod "efa77d59-616e-430f-b790-a0fcbfc53b73" (UID: "efa77d59-616e-430f-b790-a0fcbfc53b73"). InnerVolumeSpecName "kube-api-access-z74pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.529615 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z74pc\" (UniqueName: \"kubernetes.io/projected/efa77d59-616e-430f-b790-a0fcbfc53b73-kube-api-access-z74pc\") on node \"crc\" DevicePath \"\"" Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.792622 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533654-d89m7" event={"ID":"efa77d59-616e-430f-b790-a0fcbfc53b73","Type":"ContainerDied","Data":"2b03a993374678132ee3f09563ae53deb6d1deab364289e74a241fb70d64fe24"} Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.792659 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b03a993374678132ee3f09563ae53deb6d1deab364289e74a241fb70d64fe24" Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.792744 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533654-d89m7" Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.843970 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533648-pdctk"] Feb 25 11:34:05 crc kubenswrapper[4725]: I0225 11:34:05.851298 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533648-pdctk"] Feb 25 11:34:07 crc kubenswrapper[4725]: I0225 11:34:07.242611 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09858624-9ec4-4226-b86d-c6fc95b91ba9" path="/var/lib/kubelet/pods/09858624-9ec4-4226-b86d-c6fc95b91ba9/volumes" Feb 25 11:34:09 crc kubenswrapper[4725]: I0225 11:34:09.224453 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:34:09 crc kubenswrapper[4725]: E0225 11:34:09.225329 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:34:20 crc kubenswrapper[4725]: I0225 11:34:20.663686 4725 scope.go:117] "RemoveContainer" containerID="36fc4f5ac8d7b9bc2a7afb4209b58c4a9bc204b872b1d5d385ca72af86f632b3" Feb 25 11:34:24 crc kubenswrapper[4725]: I0225 11:34:24.224242 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:34:24 crc kubenswrapper[4725]: E0225 11:34:24.224821 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:34:37 crc kubenswrapper[4725]: I0225 11:34:37.225526 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:34:37 crc kubenswrapper[4725]: E0225 11:34:37.226557 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.544468 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-89qzk"] Feb 25 11:34:43 crc kubenswrapper[4725]: E0225 11:34:43.545938 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa77d59-616e-430f-b790-a0fcbfc53b73" containerName="oc" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.545967 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa77d59-616e-430f-b790-a0fcbfc53b73" containerName="oc" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.546408 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa77d59-616e-430f-b790-a0fcbfc53b73" containerName="oc" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.549708 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.558062 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89qzk"] Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.641958 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-utilities\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.642124 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqtm2\" (UniqueName: \"kubernetes.io/projected/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-kube-api-access-vqtm2\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.642228 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-catalog-content\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.744334 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-utilities\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.744462 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqtm2\" (UniqueName: \"kubernetes.io/projected/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-kube-api-access-vqtm2\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.744523 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-catalog-content\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.745020 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-utilities\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.745104 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-catalog-content\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.770930 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqtm2\" (UniqueName: \"kubernetes.io/projected/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-kube-api-access-vqtm2\") pod \"redhat-operators-89qzk\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:43 crc kubenswrapper[4725]: I0225 11:34:43.910821 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:44 crc kubenswrapper[4725]: I0225 11:34:44.378761 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89qzk"] Feb 25 11:34:45 crc kubenswrapper[4725]: I0225 11:34:45.286113 4725 generic.go:334] "Generic (PLEG): container finished" podID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerID="c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1" exitCode=0 Feb 25 11:34:45 crc kubenswrapper[4725]: I0225 11:34:45.286472 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89qzk" event={"ID":"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a","Type":"ContainerDied","Data":"c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1"} Feb 25 11:34:45 crc kubenswrapper[4725]: I0225 11:34:45.286511 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89qzk" event={"ID":"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a","Type":"ContainerStarted","Data":"32bbb6caed612627da1362158445028c9cb5440416312de9306c907eb915c3d4"} Feb 25 11:34:46 crc kubenswrapper[4725]: I0225 11:34:46.299536 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-89qzk" event={"ID":"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a","Type":"ContainerStarted","Data":"090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a"} Feb 25 11:34:47 crc kubenswrapper[4725]: I0225 11:34:47.315047 4725 generic.go:334] "Generic (PLEG): container finished" podID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerID="090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a" exitCode=0 Feb 25 11:34:47 crc kubenswrapper[4725]: I0225 11:34:47.315107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89qzk" event={"ID":"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a","Type":"ContainerDied","Data":"090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a"} Feb 25 11:34:48 crc kubenswrapper[4725]: I0225 11:34:48.323705 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89qzk" event={"ID":"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a","Type":"ContainerStarted","Data":"c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e"} Feb 25 11:34:48 crc kubenswrapper[4725]: I0225 11:34:48.354671 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-89qzk" podStartSLOduration=2.911632458 podStartE2EDuration="5.3546445s" podCreationTimestamp="2026-02-25 11:34:43 +0000 UTC" firstStartedPulling="2026-02-25 11:34:45.290620586 +0000 UTC m=+2510.789202651" lastFinishedPulling="2026-02-25 11:34:47.733632638 +0000 UTC m=+2513.232214693" observedRunningTime="2026-02-25 11:34:48.343564425 +0000 UTC m=+2513.842146480" watchObservedRunningTime="2026-02-25 11:34:48.3546445 +0000 UTC m=+2513.853226535" Feb 25 11:34:50 crc kubenswrapper[4725]: I0225 11:34:50.225489 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:34:50 crc kubenswrapper[4725]: E0225 11:34:50.226281 4725 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:34:53 crc kubenswrapper[4725]: I0225 11:34:53.911665 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:53 crc kubenswrapper[4725]: I0225 11:34:53.911947 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:34:54 crc kubenswrapper[4725]: I0225 11:34:54.995957 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-89qzk" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="registry-server" probeResult="failure" output=< Feb 25 11:34:54 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 25 11:34:54 crc kubenswrapper[4725]: > Feb 25 11:34:55 crc kubenswrapper[4725]: I0225 11:34:55.808470 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-njqjm"] Feb 25 11:34:55 crc kubenswrapper[4725]: I0225 11:34:55.810535 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:55 crc kubenswrapper[4725]: I0225 11:34:55.846908 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-njqjm"] Feb 25 11:34:55 crc kubenswrapper[4725]: I0225 11:34:55.907937 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-catalog-content\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:55 crc kubenswrapper[4725]: I0225 11:34:55.908092 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-utilities\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:55 crc kubenswrapper[4725]: I0225 11:34:55.908162 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzljm\" (UniqueName: \"kubernetes.io/projected/3a4e2c96-634f-48ee-8906-da950daa746a-kube-api-access-kzljm\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.009346 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-utilities\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.009691 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kzljm\" (UniqueName: \"kubernetes.io/projected/3a4e2c96-634f-48ee-8906-da950daa746a-kube-api-access-kzljm\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.009759 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-catalog-content\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.010376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-catalog-content\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.010373 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-utilities\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.030010 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzljm\" (UniqueName: \"kubernetes.io/projected/3a4e2c96-634f-48ee-8906-da950daa746a-kube-api-access-kzljm\") pod \"certified-operators-njqjm\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.148222 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:34:56 crc kubenswrapper[4725]: I0225 11:34:56.709432 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-njqjm"] Feb 25 11:34:57 crc kubenswrapper[4725]: I0225 11:34:57.422211 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerStarted","Data":"bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58"} Feb 25 11:34:57 crc kubenswrapper[4725]: I0225 11:34:57.422638 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerStarted","Data":"c25c2b6525a6b814663e17db3bc46d748ca3504a676f1f72aec028cb6ea9e435"} Feb 25 11:34:58 crc kubenswrapper[4725]: I0225 11:34:58.435467 4725 generic.go:334] "Generic (PLEG): container finished" podID="3a4e2c96-634f-48ee-8906-da950daa746a" containerID="bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58" exitCode=0 Feb 25 11:34:58 crc kubenswrapper[4725]: I0225 11:34:58.435510 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerDied","Data":"bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58"} Feb 25 11:34:59 crc kubenswrapper[4725]: I0225 11:34:59.452549 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerStarted","Data":"9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b"} Feb 25 11:35:00 crc kubenswrapper[4725]: I0225 11:35:00.477095 4725 generic.go:334] "Generic (PLEG): container finished" podID="3a4e2c96-634f-48ee-8906-da950daa746a" 
containerID="9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b" exitCode=0 Feb 25 11:35:00 crc kubenswrapper[4725]: I0225 11:35:00.477142 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerDied","Data":"9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b"} Feb 25 11:35:01 crc kubenswrapper[4725]: I0225 11:35:01.488047 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerStarted","Data":"d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e"} Feb 25 11:35:01 crc kubenswrapper[4725]: I0225 11:35:01.510700 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-njqjm" podStartSLOduration=4.076554527 podStartE2EDuration="6.510680903s" podCreationTimestamp="2026-02-25 11:34:55 +0000 UTC" firstStartedPulling="2026-02-25 11:34:58.439514448 +0000 UTC m=+2523.938096503" lastFinishedPulling="2026-02-25 11:35:00.873640854 +0000 UTC m=+2526.372222879" observedRunningTime="2026-02-25 11:35:01.508646848 +0000 UTC m=+2527.007228883" watchObservedRunningTime="2026-02-25 11:35:01.510680903 +0000 UTC m=+2527.009262938" Feb 25 11:35:03 crc kubenswrapper[4725]: I0225 11:35:03.994594 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:35:04 crc kubenswrapper[4725]: I0225 11:35:04.076533 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:35:05 crc kubenswrapper[4725]: I0225 11:35:05.190901 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89qzk"] Feb 25 11:35:05 crc kubenswrapper[4725]: I0225 11:35:05.233178 4725 scope.go:117] 
"RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:35:05 crc kubenswrapper[4725]: E0225 11:35:05.233783 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:35:05 crc kubenswrapper[4725]: I0225 11:35:05.534099 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-89qzk" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="registry-server" containerID="cri-o://c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e" gracePeriod=2 Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.107341 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.149017 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.149372 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.239451 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.273387 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-utilities\") pod \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.273530 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-catalog-content\") pod \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.273586 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqtm2\" (UniqueName: \"kubernetes.io/projected/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-kube-api-access-vqtm2\") pod \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\" (UID: \"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a\") " Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.275035 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-utilities" (OuterVolumeSpecName: "utilities") pod 
"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" (UID: "bd6f34fd-03f3-4c2b-ac05-6628d6947e5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.276362 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.280457 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-kube-api-access-vqtm2" (OuterVolumeSpecName: "kube-api-access-vqtm2") pod "bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" (UID: "bd6f34fd-03f3-4c2b-ac05-6628d6947e5a"). InnerVolumeSpecName "kube-api-access-vqtm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.371603 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" (UID: "bd6f34fd-03f3-4c2b-ac05-6628d6947e5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.378748 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.379140 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqtm2\" (UniqueName: \"kubernetes.io/projected/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a-kube-api-access-vqtm2\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.550258 4725 generic.go:334] "Generic (PLEG): container finished" podID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerID="c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e" exitCode=0 Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.550333 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89qzk" event={"ID":"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a","Type":"ContainerDied","Data":"c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e"} Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.550377 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89qzk" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.550402 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89qzk" event={"ID":"bd6f34fd-03f3-4c2b-ac05-6628d6947e5a","Type":"ContainerDied","Data":"32bbb6caed612627da1362158445028c9cb5440416312de9306c907eb915c3d4"} Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.550435 4725 scope.go:117] "RemoveContainer" containerID="c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.603131 4725 scope.go:117] "RemoveContainer" containerID="090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.634191 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89qzk"] Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.642468 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-89qzk"] Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.648042 4725 scope.go:117] "RemoveContainer" containerID="c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.652997 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.708408 4725 scope.go:117] "RemoveContainer" containerID="c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e" Feb 25 11:35:06 crc kubenswrapper[4725]: E0225 11:35:06.709040 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e\": container with ID starting with c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e 
not found: ID does not exist" containerID="c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.709087 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e"} err="failed to get container status \"c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e\": rpc error: code = NotFound desc = could not find container \"c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e\": container with ID starting with c728db832bb5a7a8d0d903065c4a5780db118b71b708cccc5cd1e32dbd295d9e not found: ID does not exist" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.709118 4725 scope.go:117] "RemoveContainer" containerID="090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a" Feb 25 11:35:06 crc kubenswrapper[4725]: E0225 11:35:06.709463 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a\": container with ID starting with 090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a not found: ID does not exist" containerID="090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.709509 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a"} err="failed to get container status \"090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a\": rpc error: code = NotFound desc = could not find container \"090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a\": container with ID starting with 090e0fa13d17481385fc79552d8f954bdc5faacc3ae026f0a581c90ab0c9ac8a not found: ID does not exist" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 
11:35:06.709537 4725 scope.go:117] "RemoveContainer" containerID="c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1" Feb 25 11:35:06 crc kubenswrapper[4725]: E0225 11:35:06.709915 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1\": container with ID starting with c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1 not found: ID does not exist" containerID="c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1" Feb 25 11:35:06 crc kubenswrapper[4725]: I0225 11:35:06.709947 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1"} err="failed to get container status \"c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1\": rpc error: code = NotFound desc = could not find container \"c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1\": container with ID starting with c57328182c77ad3dd3c1a5d14c9c93d17356999d24c40a26dc46f62eb9ff0ec1 not found: ID does not exist" Feb 25 11:35:07 crc kubenswrapper[4725]: I0225 11:35:07.244038 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" path="/var/lib/kubelet/pods/bd6f34fd-03f3-4c2b-ac05-6628d6947e5a/volumes" Feb 25 11:35:08 crc kubenswrapper[4725]: I0225 11:35:08.595985 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-njqjm"] Feb 25 11:35:09 crc kubenswrapper[4725]: I0225 11:35:09.587910 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-njqjm" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" containerName="registry-server" containerID="cri-o://d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e" gracePeriod=2 Feb 25 11:35:10 
crc kubenswrapper[4725]: I0225 11:35:10.171499 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.265177 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-utilities\") pod \"3a4e2c96-634f-48ee-8906-da950daa746a\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.265306 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzljm\" (UniqueName: \"kubernetes.io/projected/3a4e2c96-634f-48ee-8906-da950daa746a-kube-api-access-kzljm\") pod \"3a4e2c96-634f-48ee-8906-da950daa746a\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.265593 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-catalog-content\") pod \"3a4e2c96-634f-48ee-8906-da950daa746a\" (UID: \"3a4e2c96-634f-48ee-8906-da950daa746a\") " Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.266289 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-utilities" (OuterVolumeSpecName: "utilities") pod "3a4e2c96-634f-48ee-8906-da950daa746a" (UID: "3a4e2c96-634f-48ee-8906-da950daa746a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.271640 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a4e2c96-634f-48ee-8906-da950daa746a-kube-api-access-kzljm" (OuterVolumeSpecName: "kube-api-access-kzljm") pod "3a4e2c96-634f-48ee-8906-da950daa746a" (UID: "3a4e2c96-634f-48ee-8906-da950daa746a"). InnerVolumeSpecName "kube-api-access-kzljm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.348038 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a4e2c96-634f-48ee-8906-da950daa746a" (UID: "3a4e2c96-634f-48ee-8906-da950daa746a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.368256 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.368306 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4e2c96-634f-48ee-8906-da950daa746a-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.368323 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzljm\" (UniqueName: \"kubernetes.io/projected/3a4e2c96-634f-48ee-8906-da950daa746a-kube-api-access-kzljm\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.601037 4725 generic.go:334] "Generic (PLEG): container finished" podID="3a4e2c96-634f-48ee-8906-da950daa746a" 
containerID="d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e" exitCode=0 Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.601122 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerDied","Data":"d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e"} Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.601182 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-njqjm" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.601657 4725 scope.go:117] "RemoveContainer" containerID="d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.601524 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njqjm" event={"ID":"3a4e2c96-634f-48ee-8906-da950daa746a","Type":"ContainerDied","Data":"c25c2b6525a6b814663e17db3bc46d748ca3504a676f1f72aec028cb6ea9e435"} Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.632941 4725 scope.go:117] "RemoveContainer" containerID="9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.653074 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-njqjm"] Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.660621 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-njqjm"] Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.684326 4725 scope.go:117] "RemoveContainer" containerID="bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.726732 4725 scope.go:117] "RemoveContainer" containerID="d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e" Feb 25 
11:35:10 crc kubenswrapper[4725]: E0225 11:35:10.727203 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e\": container with ID starting with d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e not found: ID does not exist" containerID="d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.727303 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e"} err="failed to get container status \"d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e\": rpc error: code = NotFound desc = could not find container \"d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e\": container with ID starting with d439bbe4c9da0b73391bb94be14f54bd1b88703c4e55cdce68fe35b31b98917e not found: ID does not exist" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.727385 4725 scope.go:117] "RemoveContainer" containerID="9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b" Feb 25 11:35:10 crc kubenswrapper[4725]: E0225 11:35:10.727964 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b\": container with ID starting with 9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b not found: ID does not exist" containerID="9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.728017 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b"} err="failed to get container status 
\"9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b\": rpc error: code = NotFound desc = could not find container \"9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b\": container with ID starting with 9d9aacd0fab1ee124e0eac428006988475b6e0bf02db79073289ba2eddfd925b not found: ID does not exist" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.728049 4725 scope.go:117] "RemoveContainer" containerID="bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58" Feb 25 11:35:10 crc kubenswrapper[4725]: E0225 11:35:10.728628 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58\": container with ID starting with bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58 not found: ID does not exist" containerID="bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58" Feb 25 11:35:10 crc kubenswrapper[4725]: I0225 11:35:10.728718 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58"} err="failed to get container status \"bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58\": rpc error: code = NotFound desc = could not find container \"bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58\": container with ID starting with bcd6eff69af86c55bc2eaf855d49c8bf464cca9f31c3a7b5a5129b227b7d6c58 not found: ID does not exist" Feb 25 11:35:11 crc kubenswrapper[4725]: I0225 11:35:11.243510 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" path="/var/lib/kubelet/pods/3a4e2c96-634f-48ee-8906-da950daa746a/volumes" Feb 25 11:35:20 crc kubenswrapper[4725]: I0225 11:35:20.224382 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 
11:35:20 crc kubenswrapper[4725]: E0225 11:35:20.225574 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:35:35 crc kubenswrapper[4725]: I0225 11:35:35.237513 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:35:35 crc kubenswrapper[4725]: E0225 11:35:35.239794 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:35:47 crc kubenswrapper[4725]: I0225 11:35:47.224769 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:35:47 crc kubenswrapper[4725]: E0225 11:35:47.225663 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:35:49 crc kubenswrapper[4725]: I0225 11:35:49.131971 4725 generic.go:334] "Generic (PLEG): container finished" podID="4c1ac37f-ee50-4446-8433-5c3f1c427205" 
containerID="64a869573e317d712e88f9257f812daa600d151120427d13aeb3cb36336934ab" exitCode=0 Feb 25 11:35:49 crc kubenswrapper[4725]: I0225 11:35:49.132068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" event={"ID":"4c1ac37f-ee50-4446-8433-5c3f1c427205","Type":"ContainerDied","Data":"64a869573e317d712e88f9257f812daa600d151120427d13aeb3cb36336934ab"} Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.653433 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.768881 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzrbn\" (UniqueName: \"kubernetes.io/projected/4c1ac37f-ee50-4446-8433-5c3f1c427205-kube-api-access-nzrbn\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769189 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-ssh-key-openstack-edpm-ipam\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769226 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-3\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769244 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-2\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769312 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-extra-config-0\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769344 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-1\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769383 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-1\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769477 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-0\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769900 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-inventory\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: 
\"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769922 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-combined-ca-bundle\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.769980 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-0\") pod \"4c1ac37f-ee50-4446-8433-5c3f1c427205\" (UID: \"4c1ac37f-ee50-4446-8433-5c3f1c427205\") " Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.774757 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1ac37f-ee50-4446-8433-5c3f1c427205-kube-api-access-nzrbn" (OuterVolumeSpecName: "kube-api-access-nzrbn") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "kube-api-access-nzrbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.776059 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.797619 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.800027 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.800769 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.802656 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.805701 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.809774 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.812559 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.813223 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.815662 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-inventory" (OuterVolumeSpecName: "inventory") pod "4c1ac37f-ee50-4446-8433-5c3f1c427205" (UID: "4c1ac37f-ee50-4446-8433-5c3f1c427205"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872662 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872719 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872739 4725 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872758 4725 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872775 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzrbn\" (UniqueName: \"kubernetes.io/projected/4c1ac37f-ee50-4446-8433-5c3f1c427205-kube-api-access-nzrbn\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872791 4725 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872805 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872817 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872866 4725 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872879 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:50 crc kubenswrapper[4725]: I0225 11:35:50.872892 4725 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4c1ac37f-ee50-4446-8433-5c3f1c427205-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.155746 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" event={"ID":"4c1ac37f-ee50-4446-8433-5c3f1c427205","Type":"ContainerDied","Data":"669004a0185ee3bb9b60ced37df9c68c7c7d90d2309c8b4ad4d396ac88d2cdc1"} Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.155798 
4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669004a0185ee3bb9b60ced37df9c68c7c7d90d2309c8b4ad4d396ac88d2cdc1" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.155896 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n8lt7" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.283710 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m"] Feb 25 11:35:51 crc kubenswrapper[4725]: E0225 11:35:51.285154 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" containerName="extract-utilities" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285177 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" containerName="extract-utilities" Feb 25 11:35:51 crc kubenswrapper[4725]: E0225 11:35:51.285192 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1ac37f-ee50-4446-8433-5c3f1c427205" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285199 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1ac37f-ee50-4446-8433-5c3f1c427205" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 11:35:51 crc kubenswrapper[4725]: E0225 11:35:51.285213 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="extract-content" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285220 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="extract-content" Feb 25 11:35:51 crc kubenswrapper[4725]: E0225 11:35:51.285261 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="registry-server" 
Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285267 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="registry-server" Feb 25 11:35:51 crc kubenswrapper[4725]: E0225 11:35:51.285278 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" containerName="extract-content" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285284 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" containerName="extract-content" Feb 25 11:35:51 crc kubenswrapper[4725]: E0225 11:35:51.285293 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="extract-utilities" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285299 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="extract-utilities" Feb 25 11:35:51 crc kubenswrapper[4725]: E0225 11:35:51.285317 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" containerName="registry-server" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285323 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" containerName="registry-server" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285493 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1ac37f-ee50-4446-8433-5c3f1c427205" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285516 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6f34fd-03f3-4c2b-ac05-6628d6947e5a" containerName="registry-server" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.285533 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4e2c96-634f-48ee-8906-da950daa746a" 
containerName="registry-server" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.286101 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.288993 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.289282 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.289619 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4p75z" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.290383 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.297111 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.301409 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m"] Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.383581 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.383946 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.384110 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.384297 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.384577 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.384774 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.385023 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdwzq\" (UniqueName: \"kubernetes.io/projected/07f2c78c-f46d-4751-ae1b-ac502a378ff4-kube-api-access-cdwzq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.486605 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.487187 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.487232 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.487267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.487300 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.487325 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.487361 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdwzq\" (UniqueName: \"kubernetes.io/projected/07f2c78c-f46d-4751-ae1b-ac502a378ff4-kube-api-access-cdwzq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 
crc kubenswrapper[4725]: I0225 11:35:51.490744 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.490899 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.492571 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.495095 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.495509 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.498053 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.504870 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdwzq\" (UniqueName: \"kubernetes.io/projected/07f2c78c-f46d-4751-ae1b-ac502a378ff4-kube-api-access-cdwzq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dg75m\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:51 crc kubenswrapper[4725]: I0225 11:35:51.614915 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:35:52 crc kubenswrapper[4725]: I0225 11:35:52.154120 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m"] Feb 25 11:35:53 crc kubenswrapper[4725]: I0225 11:35:53.175084 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" event={"ID":"07f2c78c-f46d-4751-ae1b-ac502a378ff4","Type":"ContainerStarted","Data":"f4afd590c82b2f16636eb17ba0ace498cbd335d0573c87cba3a30344ff988969"} Feb 25 11:35:53 crc kubenswrapper[4725]: I0225 11:35:53.175366 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" event={"ID":"07f2c78c-f46d-4751-ae1b-ac502a378ff4","Type":"ContainerStarted","Data":"f2b4a8f59366e013dc6ab63e747d1d1f34ea28fdd968418d1d81338f8486e1ad"} Feb 25 11:35:53 crc kubenswrapper[4725]: I0225 11:35:53.209923 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" podStartSLOduration=1.735949982 podStartE2EDuration="2.209895109s" podCreationTimestamp="2026-02-25 11:35:51 +0000 UTC" firstStartedPulling="2026-02-25 11:35:52.165068828 +0000 UTC m=+2577.663650853" lastFinishedPulling="2026-02-25 11:35:52.639013945 +0000 UTC m=+2578.137595980" observedRunningTime="2026-02-25 11:35:53.191017295 +0000 UTC m=+2578.689599320" watchObservedRunningTime="2026-02-25 11:35:53.209895109 +0000 UTC m=+2578.708477204" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.153030 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533656-s2snx"] Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.156453 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533656-s2snx" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.160425 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.160749 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.161010 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.172906 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533656-s2snx"] Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.226181 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:36:00 crc kubenswrapper[4725]: E0225 11:36:00.226732 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.303302 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68spx\" (UniqueName: \"kubernetes.io/projected/03fa4551-240f-4e9f-a3ba-ca8918695d0e-kube-api-access-68spx\") pod \"auto-csr-approver-29533656-s2snx\" (UID: \"03fa4551-240f-4e9f-a3ba-ca8918695d0e\") " pod="openshift-infra/auto-csr-approver-29533656-s2snx" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.408375 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-68spx\" (UniqueName: \"kubernetes.io/projected/03fa4551-240f-4e9f-a3ba-ca8918695d0e-kube-api-access-68spx\") pod \"auto-csr-approver-29533656-s2snx\" (UID: \"03fa4551-240f-4e9f-a3ba-ca8918695d0e\") " pod="openshift-infra/auto-csr-approver-29533656-s2snx" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.446357 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68spx\" (UniqueName: \"kubernetes.io/projected/03fa4551-240f-4e9f-a3ba-ca8918695d0e-kube-api-access-68spx\") pod \"auto-csr-approver-29533656-s2snx\" (UID: \"03fa4551-240f-4e9f-a3ba-ca8918695d0e\") " pod="openshift-infra/auto-csr-approver-29533656-s2snx" Feb 25 11:36:00 crc kubenswrapper[4725]: I0225 11:36:00.489318 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533656-s2snx" Feb 25 11:36:01 crc kubenswrapper[4725]: I0225 11:36:01.047015 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533656-s2snx"] Feb 25 11:36:01 crc kubenswrapper[4725]: W0225 11:36:01.056925 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03fa4551_240f_4e9f_a3ba_ca8918695d0e.slice/crio-92497c0787a12e0e6ff17478dcb86e3415f95821d44ae2b8e151ca671c506050 WatchSource:0}: Error finding container 92497c0787a12e0e6ff17478dcb86e3415f95821d44ae2b8e151ca671c506050: Status 404 returned error can't find the container with id 92497c0787a12e0e6ff17478dcb86e3415f95821d44ae2b8e151ca671c506050 Feb 25 11:36:01 crc kubenswrapper[4725]: I0225 11:36:01.302618 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533656-s2snx" event={"ID":"03fa4551-240f-4e9f-a3ba-ca8918695d0e","Type":"ContainerStarted","Data":"92497c0787a12e0e6ff17478dcb86e3415f95821d44ae2b8e151ca671c506050"} Feb 25 11:36:02 crc 
kubenswrapper[4725]: I0225 11:36:02.316341 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533656-s2snx" event={"ID":"03fa4551-240f-4e9f-a3ba-ca8918695d0e","Type":"ContainerStarted","Data":"aff3b4b66c877b21570fdd97b46481ea9e3f970e38ed73b20fb49d13586fe5dc"} Feb 25 11:36:02 crc kubenswrapper[4725]: I0225 11:36:02.346189 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533656-s2snx" podStartSLOduration=1.492252166 podStartE2EDuration="2.346163332s" podCreationTimestamp="2026-02-25 11:36:00 +0000 UTC" firstStartedPulling="2026-02-25 11:36:01.060113034 +0000 UTC m=+2586.558695099" lastFinishedPulling="2026-02-25 11:36:01.91402421 +0000 UTC m=+2587.412606265" observedRunningTime="2026-02-25 11:36:02.334258354 +0000 UTC m=+2587.832840409" watchObservedRunningTime="2026-02-25 11:36:02.346163332 +0000 UTC m=+2587.844745387" Feb 25 11:36:03 crc kubenswrapper[4725]: I0225 11:36:03.330965 4725 generic.go:334] "Generic (PLEG): container finished" podID="03fa4551-240f-4e9f-a3ba-ca8918695d0e" containerID="aff3b4b66c877b21570fdd97b46481ea9e3f970e38ed73b20fb49d13586fe5dc" exitCode=0 Feb 25 11:36:03 crc kubenswrapper[4725]: I0225 11:36:03.331029 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533656-s2snx" event={"ID":"03fa4551-240f-4e9f-a3ba-ca8918695d0e","Type":"ContainerDied","Data":"aff3b4b66c877b21570fdd97b46481ea9e3f970e38ed73b20fb49d13586fe5dc"} Feb 25 11:36:04 crc kubenswrapper[4725]: I0225 11:36:04.757530 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533656-s2snx" Feb 25 11:36:04 crc kubenswrapper[4725]: I0225 11:36:04.919443 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68spx\" (UniqueName: \"kubernetes.io/projected/03fa4551-240f-4e9f-a3ba-ca8918695d0e-kube-api-access-68spx\") pod \"03fa4551-240f-4e9f-a3ba-ca8918695d0e\" (UID: \"03fa4551-240f-4e9f-a3ba-ca8918695d0e\") " Feb 25 11:36:04 crc kubenswrapper[4725]: I0225 11:36:04.929123 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fa4551-240f-4e9f-a3ba-ca8918695d0e-kube-api-access-68spx" (OuterVolumeSpecName: "kube-api-access-68spx") pod "03fa4551-240f-4e9f-a3ba-ca8918695d0e" (UID: "03fa4551-240f-4e9f-a3ba-ca8918695d0e"). InnerVolumeSpecName "kube-api-access-68spx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:36:05 crc kubenswrapper[4725]: I0225 11:36:05.022403 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68spx\" (UniqueName: \"kubernetes.io/projected/03fa4551-240f-4e9f-a3ba-ca8918695d0e-kube-api-access-68spx\") on node \"crc\" DevicePath \"\"" Feb 25 11:36:05 crc kubenswrapper[4725]: I0225 11:36:05.354941 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533656-s2snx" event={"ID":"03fa4551-240f-4e9f-a3ba-ca8918695d0e","Type":"ContainerDied","Data":"92497c0787a12e0e6ff17478dcb86e3415f95821d44ae2b8e151ca671c506050"} Feb 25 11:36:05 crc kubenswrapper[4725]: I0225 11:36:05.355003 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92497c0787a12e0e6ff17478dcb86e3415f95821d44ae2b8e151ca671c506050" Feb 25 11:36:05 crc kubenswrapper[4725]: I0225 11:36:05.355142 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533656-s2snx" Feb 25 11:36:05 crc kubenswrapper[4725]: I0225 11:36:05.442612 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533650-xfh7n"] Feb 25 11:36:05 crc kubenswrapper[4725]: I0225 11:36:05.453205 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533650-xfh7n"] Feb 25 11:36:07 crc kubenswrapper[4725]: I0225 11:36:07.264238 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c240bb9-703d-46d4-81b8-6f733dac4d9d" path="/var/lib/kubelet/pods/2c240bb9-703d-46d4-81b8-6f733dac4d9d/volumes" Feb 25 11:36:13 crc kubenswrapper[4725]: I0225 11:36:13.233730 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:36:13 crc kubenswrapper[4725]: E0225 11:36:13.236692 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:36:20 crc kubenswrapper[4725]: I0225 11:36:20.855499 4725 scope.go:117] "RemoveContainer" containerID="254dc42e26a8a333d26d1d3363566838ecbebf27979b9e32acb733e23b13c0f8" Feb 25 11:36:28 crc kubenswrapper[4725]: I0225 11:36:28.225513 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:36:28 crc kubenswrapper[4725]: E0225 11:36:28.227386 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:36:39 crc kubenswrapper[4725]: I0225 11:36:39.224386 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:36:39 crc kubenswrapper[4725]: E0225 11:36:39.225371 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:36:51 crc kubenswrapper[4725]: I0225 11:36:51.224779 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:36:51 crc kubenswrapper[4725]: I0225 11:36:51.841820 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"5f1352735f70d60c0184810ffaa1295427b1343d8d452f8dc314abfa9f82a71a"} Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.165626 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533658-btgxw"] Feb 25 11:38:00 crc kubenswrapper[4725]: E0225 11:38:00.167070 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fa4551-240f-4e9f-a3ba-ca8918695d0e" containerName="oc" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.167103 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fa4551-240f-4e9f-a3ba-ca8918695d0e" containerName="oc" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 
11:38:00.167543 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fa4551-240f-4e9f-a3ba-ca8918695d0e" containerName="oc" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.168771 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533658-btgxw" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.171747 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.172373 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.174282 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.197606 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533658-btgxw"] Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.315617 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xstqs\" (UniqueName: \"kubernetes.io/projected/6cf1ac0d-b1ef-4cb2-9a01-993c807a2865-kube-api-access-xstqs\") pod \"auto-csr-approver-29533658-btgxw\" (UID: \"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865\") " pod="openshift-infra/auto-csr-approver-29533658-btgxw" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.418018 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xstqs\" (UniqueName: \"kubernetes.io/projected/6cf1ac0d-b1ef-4cb2-9a01-993c807a2865-kube-api-access-xstqs\") pod \"auto-csr-approver-29533658-btgxw\" (UID: \"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865\") " pod="openshift-infra/auto-csr-approver-29533658-btgxw" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.451069 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xstqs\" (UniqueName: \"kubernetes.io/projected/6cf1ac0d-b1ef-4cb2-9a01-993c807a2865-kube-api-access-xstqs\") pod \"auto-csr-approver-29533658-btgxw\" (UID: \"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865\") " pod="openshift-infra/auto-csr-approver-29533658-btgxw" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.496677 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533658-btgxw" Feb 25 11:38:00 crc kubenswrapper[4725]: I0225 11:38:00.975602 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533658-btgxw"] Feb 25 11:38:01 crc kubenswrapper[4725]: I0225 11:38:01.721928 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533658-btgxw" event={"ID":"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865","Type":"ContainerStarted","Data":"4729c4f7259ba82f294aaca317345fd816cbe821871d5e4d73f2be4ae806a3bf"} Feb 25 11:38:02 crc kubenswrapper[4725]: I0225 11:38:02.735005 4725 generic.go:334] "Generic (PLEG): container finished" podID="6cf1ac0d-b1ef-4cb2-9a01-993c807a2865" containerID="b962be7edc0bacd0479ba97885087a92c712ae8583d496e0d4431b166d979358" exitCode=0 Feb 25 11:38:02 crc kubenswrapper[4725]: I0225 11:38:02.735058 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533658-btgxw" event={"ID":"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865","Type":"ContainerDied","Data":"b962be7edc0bacd0479ba97885087a92c712ae8583d496e0d4431b166d979358"} Feb 25 11:38:04 crc kubenswrapper[4725]: I0225 11:38:04.067061 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533658-btgxw" Feb 25 11:38:04 crc kubenswrapper[4725]: I0225 11:38:04.104170 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xstqs\" (UniqueName: \"kubernetes.io/projected/6cf1ac0d-b1ef-4cb2-9a01-993c807a2865-kube-api-access-xstqs\") pod \"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865\" (UID: \"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865\") " Feb 25 11:38:04 crc kubenswrapper[4725]: I0225 11:38:04.143796 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf1ac0d-b1ef-4cb2-9a01-993c807a2865-kube-api-access-xstqs" (OuterVolumeSpecName: "kube-api-access-xstqs") pod "6cf1ac0d-b1ef-4cb2-9a01-993c807a2865" (UID: "6cf1ac0d-b1ef-4cb2-9a01-993c807a2865"). InnerVolumeSpecName "kube-api-access-xstqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:38:04 crc kubenswrapper[4725]: I0225 11:38:04.206500 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xstqs\" (UniqueName: \"kubernetes.io/projected/6cf1ac0d-b1ef-4cb2-9a01-993c807a2865-kube-api-access-xstqs\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:04 crc kubenswrapper[4725]: I0225 11:38:04.755395 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533658-btgxw" event={"ID":"6cf1ac0d-b1ef-4cb2-9a01-993c807a2865","Type":"ContainerDied","Data":"4729c4f7259ba82f294aaca317345fd816cbe821871d5e4d73f2be4ae806a3bf"} Feb 25 11:38:04 crc kubenswrapper[4725]: I0225 11:38:04.755443 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4729c4f7259ba82f294aaca317345fd816cbe821871d5e4d73f2be4ae806a3bf" Feb 25 11:38:04 crc kubenswrapper[4725]: I0225 11:38:04.755473 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533658-btgxw" Feb 25 11:38:05 crc kubenswrapper[4725]: I0225 11:38:05.171137 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533652-fpx7l"] Feb 25 11:38:05 crc kubenswrapper[4725]: I0225 11:38:05.177542 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533652-fpx7l"] Feb 25 11:38:05 crc kubenswrapper[4725]: I0225 11:38:05.246252 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be6b69e2-5209-4fc0-8622-2e1b9dae914f" path="/var/lib/kubelet/pods/be6b69e2-5209-4fc0-8622-2e1b9dae914f/volumes" Feb 25 11:38:21 crc kubenswrapper[4725]: I0225 11:38:21.004381 4725 scope.go:117] "RemoveContainer" containerID="167494c8facf58d7dada5918c139f23d49916f338211a74adb8e69380a93d07c" Feb 25 11:38:31 crc kubenswrapper[4725]: I0225 11:38:31.030158 4725 generic.go:334] "Generic (PLEG): container finished" podID="07f2c78c-f46d-4751-ae1b-ac502a378ff4" containerID="f4afd590c82b2f16636eb17ba0ace498cbd335d0573c87cba3a30344ff988969" exitCode=0 Feb 25 11:38:31 crc kubenswrapper[4725]: I0225 11:38:31.030195 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" event={"ID":"07f2c78c-f46d-4751-ae1b-ac502a378ff4","Type":"ContainerDied","Data":"f4afd590c82b2f16636eb17ba0ace498cbd335d0573c87cba3a30344ff988969"} Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.574246 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.618706 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-2\") pod \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.619189 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-inventory\") pod \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.619955 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdwzq\" (UniqueName: \"kubernetes.io/projected/07f2c78c-f46d-4751-ae1b-ac502a378ff4-kube-api-access-cdwzq\") pod \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.620110 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-telemetry-combined-ca-bundle\") pod \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.620195 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-0\") pod \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " Feb 25 11:38:32 crc kubenswrapper[4725]: 
I0225 11:38:32.620276 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-1\") pod \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.620374 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ssh-key-openstack-edpm-ipam\") pod \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\" (UID: \"07f2c78c-f46d-4751-ae1b-ac502a378ff4\") " Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.632897 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f2c78c-f46d-4751-ae1b-ac502a378ff4-kube-api-access-cdwzq" (OuterVolumeSpecName: "kube-api-access-cdwzq") pod "07f2c78c-f46d-4751-ae1b-ac502a378ff4" (UID: "07f2c78c-f46d-4751-ae1b-ac502a378ff4"). InnerVolumeSpecName "kube-api-access-cdwzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.633410 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "07f2c78c-f46d-4751-ae1b-ac502a378ff4" (UID: "07f2c78c-f46d-4751-ae1b-ac502a378ff4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.647477 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "07f2c78c-f46d-4751-ae1b-ac502a378ff4" (UID: "07f2c78c-f46d-4751-ae1b-ac502a378ff4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.652765 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "07f2c78c-f46d-4751-ae1b-ac502a378ff4" (UID: "07f2c78c-f46d-4751-ae1b-ac502a378ff4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.656188 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "07f2c78c-f46d-4751-ae1b-ac502a378ff4" (UID: "07f2c78c-f46d-4751-ae1b-ac502a378ff4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.658007 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "07f2c78c-f46d-4751-ae1b-ac502a378ff4" (UID: "07f2c78c-f46d-4751-ae1b-ac502a378ff4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.660863 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-inventory" (OuterVolumeSpecName: "inventory") pod "07f2c78c-f46d-4751-ae1b-ac502a378ff4" (UID: "07f2c78c-f46d-4751-ae1b-ac502a378ff4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.722215 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.722342 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.722410 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.722463 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.722517 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdwzq\" (UniqueName: \"kubernetes.io/projected/07f2c78c-f46d-4751-ae1b-ac502a378ff4-kube-api-access-cdwzq\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.722577 4725 reconciler_common.go:293] 
"Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:32 crc kubenswrapper[4725]: I0225 11:38:32.722632 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/07f2c78c-f46d-4751-ae1b-ac502a378ff4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 25 11:38:33 crc kubenswrapper[4725]: I0225 11:38:33.053807 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" event={"ID":"07f2c78c-f46d-4751-ae1b-ac502a378ff4","Type":"ContainerDied","Data":"f2b4a8f59366e013dc6ab63e747d1d1f34ea28fdd968418d1d81338f8486e1ad"} Feb 25 11:38:33 crc kubenswrapper[4725]: I0225 11:38:33.053911 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2b4a8f59366e013dc6ab63e747d1d1f34ea28fdd968418d1d81338f8486e1ad" Feb 25 11:38:33 crc kubenswrapper[4725]: I0225 11:38:33.053949 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dg75m" Feb 25 11:39:11 crc kubenswrapper[4725]: I0225 11:39:11.555290 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:39:11 crc kubenswrapper[4725]: I0225 11:39:11.555908 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.658383 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 11:39:14 crc kubenswrapper[4725]: E0225 11:39:14.660639 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf1ac0d-b1ef-4cb2-9a01-993c807a2865" containerName="oc" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.660786 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf1ac0d-b1ef-4cb2-9a01-993c807a2865" containerName="oc" Feb 25 11:39:14 crc kubenswrapper[4725]: E0225 11:39:14.660936 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f2c78c-f46d-4751-ae1b-ac502a378ff4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.661045 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f2c78c-f46d-4751-ae1b-ac502a378ff4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.661440 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f2c78c-f46d-4751-ae1b-ac502a378ff4" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.661584 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf1ac0d-b1ef-4cb2-9a01-993c807a2865" containerName="oc" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.662630 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.668501 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.668931 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.669009 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.669608 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-4svv6" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.671745 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711275 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711328 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711360 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-config-data\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711405 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711431 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711462 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711478 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711512 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.711552 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mbfb\" (UniqueName: \"kubernetes.io/projected/07081f50-997d-4877-be58-a446955dfe62-kube-api-access-9mbfb\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.812950 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.813001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.813036 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-config-data\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " 
pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.813087 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.813114 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.813145 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.813162 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.813197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: 
I0225 11:39:14.813241 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mbfb\" (UniqueName: \"kubernetes.io/projected/07081f50-997d-4877-be58-a446955dfe62-kube-api-access-9mbfb\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.814492 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.814558 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.814949 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.815062 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-config-data\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.815562 4725 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.820714 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.821708 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.823431 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.841875 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mbfb\" (UniqueName: \"kubernetes.io/projected/07081f50-997d-4877-be58-a446955dfe62-kube-api-access-9mbfb\") pod \"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:14 crc kubenswrapper[4725]: I0225 11:39:14.853640 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"tempest-tests-tempest\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") " pod="openstack/tempest-tests-tempest" Feb 25 11:39:15 crc kubenswrapper[4725]: I0225 11:39:15.025198 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 11:39:15 crc kubenswrapper[4725]: I0225 11:39:15.531415 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:39:15 crc kubenswrapper[4725]: I0225 11:39:15.531577 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 11:39:16 crc kubenswrapper[4725]: I0225 11:39:16.538918 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"07081f50-997d-4877-be58-a446955dfe62","Type":"ContainerStarted","Data":"c4a69b690ae90015e395bc3abccc03b9614cf2aa9c0be63c6b36e0e49993eef8"} Feb 25 11:39:41 crc kubenswrapper[4725]: I0225 11:39:41.555908 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:39:41 crc kubenswrapper[4725]: I0225 11:39:41.556525 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:39:48 crc kubenswrapper[4725]: E0225 11:39:48.302030 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 25 11:39:48 crc kubenswrapper[4725]: E0225 
11:39:48.302639 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mbfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropag
ation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(07081f50-997d-4877-be58-a446955dfe62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 11:39:48 crc kubenswrapper[4725]: E0225 11:39:48.303923 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="07081f50-997d-4877-be58-a446955dfe62" Feb 25 11:39:48 crc kubenswrapper[4725]: E0225 11:39:48.870751 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" 
pod="openstack/tempest-tests-tempest" podUID="07081f50-997d-4877-be58-a446955dfe62" Feb 25 11:39:59 crc kubenswrapper[4725]: I0225 11:39:59.711932 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.167604 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533660-2kc5z"] Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.169594 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533660-2kc5z" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.172613 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.172815 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.174503 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.195215 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533660-2kc5z"] Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.280860 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k84zp\" (UniqueName: \"kubernetes.io/projected/37abe97b-f606-4716-8dd7-957b23922f42-kube-api-access-k84zp\") pod \"auto-csr-approver-29533660-2kc5z\" (UID: \"37abe97b-f606-4716-8dd7-957b23922f42\") " pod="openshift-infra/auto-csr-approver-29533660-2kc5z" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.385174 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k84zp\" (UniqueName: 
\"kubernetes.io/projected/37abe97b-f606-4716-8dd7-957b23922f42-kube-api-access-k84zp\") pod \"auto-csr-approver-29533660-2kc5z\" (UID: \"37abe97b-f606-4716-8dd7-957b23922f42\") " pod="openshift-infra/auto-csr-approver-29533660-2kc5z" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.411408 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k84zp\" (UniqueName: \"kubernetes.io/projected/37abe97b-f606-4716-8dd7-957b23922f42-kube-api-access-k84zp\") pod \"auto-csr-approver-29533660-2kc5z\" (UID: \"37abe97b-f606-4716-8dd7-957b23922f42\") " pod="openshift-infra/auto-csr-approver-29533660-2kc5z" Feb 25 11:40:00 crc kubenswrapper[4725]: I0225 11:40:00.512675 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533660-2kc5z" Feb 25 11:40:01 crc kubenswrapper[4725]: W0225 11:40:01.016272 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37abe97b_f606_4716_8dd7_957b23922f42.slice/crio-0c9e68a404f1f7ead85590729e5b857221fd32d167c9cf9f3c766e114e9c6c5d WatchSource:0}: Error finding container 0c9e68a404f1f7ead85590729e5b857221fd32d167c9cf9f3c766e114e9c6c5d: Status 404 returned error can't find the container with id 0c9e68a404f1f7ead85590729e5b857221fd32d167c9cf9f3c766e114e9c6c5d Feb 25 11:40:01 crc kubenswrapper[4725]: I0225 11:40:01.038613 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533660-2kc5z"] Feb 25 11:40:02 crc kubenswrapper[4725]: I0225 11:40:02.025890 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"07081f50-997d-4877-be58-a446955dfe62","Type":"ContainerStarted","Data":"fdd421c9f0b75001233241cc39e85b7819903b3d0063b1edd7058faf647b5e2d"} Feb 25 11:40:02 crc kubenswrapper[4725]: I0225 11:40:02.031984 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29533660-2kc5z" event={"ID":"37abe97b-f606-4716-8dd7-957b23922f42","Type":"ContainerStarted","Data":"0c9e68a404f1f7ead85590729e5b857221fd32d167c9cf9f3c766e114e9c6c5d"} Feb 25 11:40:02 crc kubenswrapper[4725]: I0225 11:40:02.066504 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.888192286 podStartE2EDuration="49.066472954s" podCreationTimestamp="2026-02-25 11:39:13 +0000 UTC" firstStartedPulling="2026-02-25 11:39:15.531080566 +0000 UTC m=+2781.029662601" lastFinishedPulling="2026-02-25 11:39:59.709361234 +0000 UTC m=+2825.207943269" observedRunningTime="2026-02-25 11:40:02.057953557 +0000 UTC m=+2827.556535662" watchObservedRunningTime="2026-02-25 11:40:02.066472954 +0000 UTC m=+2827.565055019" Feb 25 11:40:03 crc kubenswrapper[4725]: I0225 11:40:03.047445 4725 generic.go:334] "Generic (PLEG): container finished" podID="37abe97b-f606-4716-8dd7-957b23922f42" containerID="449ae2c78d7d9e832975ddc4432326b75d9aae940519a52a8b6e23236fac8c88" exitCode=0 Feb 25 11:40:03 crc kubenswrapper[4725]: I0225 11:40:03.047759 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533660-2kc5z" event={"ID":"37abe97b-f606-4716-8dd7-957b23922f42","Type":"ContainerDied","Data":"449ae2c78d7d9e832975ddc4432326b75d9aae940519a52a8b6e23236fac8c88"} Feb 25 11:40:04 crc kubenswrapper[4725]: I0225 11:40:04.588729 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533660-2kc5z" Feb 25 11:40:04 crc kubenswrapper[4725]: I0225 11:40:04.675885 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k84zp\" (UniqueName: \"kubernetes.io/projected/37abe97b-f606-4716-8dd7-957b23922f42-kube-api-access-k84zp\") pod \"37abe97b-f606-4716-8dd7-957b23922f42\" (UID: \"37abe97b-f606-4716-8dd7-957b23922f42\") " Feb 25 11:40:04 crc kubenswrapper[4725]: I0225 11:40:04.688680 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37abe97b-f606-4716-8dd7-957b23922f42-kube-api-access-k84zp" (OuterVolumeSpecName: "kube-api-access-k84zp") pod "37abe97b-f606-4716-8dd7-957b23922f42" (UID: "37abe97b-f606-4716-8dd7-957b23922f42"). InnerVolumeSpecName "kube-api-access-k84zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:40:04 crc kubenswrapper[4725]: I0225 11:40:04.779937 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k84zp\" (UniqueName: \"kubernetes.io/projected/37abe97b-f606-4716-8dd7-957b23922f42-kube-api-access-k84zp\") on node \"crc\" DevicePath \"\"" Feb 25 11:40:05 crc kubenswrapper[4725]: I0225 11:40:05.078138 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533660-2kc5z" event={"ID":"37abe97b-f606-4716-8dd7-957b23922f42","Type":"ContainerDied","Data":"0c9e68a404f1f7ead85590729e5b857221fd32d167c9cf9f3c766e114e9c6c5d"} Feb 25 11:40:05 crc kubenswrapper[4725]: I0225 11:40:05.078472 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c9e68a404f1f7ead85590729e5b857221fd32d167c9cf9f3c766e114e9c6c5d" Feb 25 11:40:05 crc kubenswrapper[4725]: I0225 11:40:05.078226 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533660-2kc5z" Feb 25 11:40:05 crc kubenswrapper[4725]: I0225 11:40:05.677293 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533654-d89m7"] Feb 25 11:40:05 crc kubenswrapper[4725]: I0225 11:40:05.693431 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533654-d89m7"] Feb 25 11:40:07 crc kubenswrapper[4725]: I0225 11:40:07.240981 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa77d59-616e-430f-b790-a0fcbfc53b73" path="/var/lib/kubelet/pods/efa77d59-616e-430f-b790-a0fcbfc53b73/volumes" Feb 25 11:40:11 crc kubenswrapper[4725]: I0225 11:40:11.555634 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:40:11 crc kubenswrapper[4725]: I0225 11:40:11.556320 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:40:11 crc kubenswrapper[4725]: I0225 11:40:11.556380 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 11:40:11 crc kubenswrapper[4725]: I0225 11:40:11.557419 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f1352735f70d60c0184810ffaa1295427b1343d8d452f8dc314abfa9f82a71a"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 11:40:11 crc kubenswrapper[4725]: I0225 11:40:11.557530 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://5f1352735f70d60c0184810ffaa1295427b1343d8d452f8dc314abfa9f82a71a" gracePeriod=600 Feb 25 11:40:12 crc kubenswrapper[4725]: I0225 11:40:12.164719 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="5f1352735f70d60c0184810ffaa1295427b1343d8d452f8dc314abfa9f82a71a" exitCode=0 Feb 25 11:40:12 crc kubenswrapper[4725]: I0225 11:40:12.164790 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"5f1352735f70d60c0184810ffaa1295427b1343d8d452f8dc314abfa9f82a71a"} Feb 25 11:40:12 crc kubenswrapper[4725]: I0225 11:40:12.165091 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"} Feb 25 11:40:12 crc kubenswrapper[4725]: I0225 11:40:12.165115 4725 scope.go:117] "RemoveContainer" containerID="2a70b2660b7eaf60ff10fcbd280f45dbd8fb05881aacb55a6b47cb601c4af378" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.682141 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s7p5z"] Feb 25 11:40:19 crc kubenswrapper[4725]: E0225 11:40:19.683195 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37abe97b-f606-4716-8dd7-957b23922f42" containerName="oc" Feb 25 11:40:19 crc kubenswrapper[4725]: 
I0225 11:40:19.683211 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="37abe97b-f606-4716-8dd7-957b23922f42" containerName="oc" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.683478 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="37abe97b-f606-4716-8dd7-957b23922f42" containerName="oc" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.685123 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.694895 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7p5z"] Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.809913 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprpv\" (UniqueName: \"kubernetes.io/projected/024add56-921d-4c61-83fa-160e5ec2dba1-kube-api-access-qprpv\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.810028 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-catalog-content\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.810157 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-utilities\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 
11:40:19.912121 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-utilities\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.912272 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qprpv\" (UniqueName: \"kubernetes.io/projected/024add56-921d-4c61-83fa-160e5ec2dba1-kube-api-access-qprpv\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.912392 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-catalog-content\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.912977 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-utilities\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.913138 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-catalog-content\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:19 crc kubenswrapper[4725]: I0225 11:40:19.934277 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprpv\" (UniqueName: \"kubernetes.io/projected/024add56-921d-4c61-83fa-160e5ec2dba1-kube-api-access-qprpv\") pod \"community-operators-s7p5z\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") " pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:20 crc kubenswrapper[4725]: I0225 11:40:20.044714 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7p5z" Feb 25 11:40:20 crc kubenswrapper[4725]: I0225 11:40:20.560542 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7p5z"] Feb 25 11:40:21 crc kubenswrapper[4725]: I0225 11:40:21.135914 4725 scope.go:117] "RemoveContainer" containerID="9120a675830f20e7616c6e03e5ad6ccc09d11df2709df7aa36ab8b18ecb7ff55" Feb 25 11:40:21 crc kubenswrapper[4725]: I0225 11:40:21.293808 4725 generic.go:334] "Generic (PLEG): container finished" podID="024add56-921d-4c61-83fa-160e5ec2dba1" containerID="826cd21fb35eefe9f42192cd0e127cfc73fb8654e3a78d2c5baff6bd2bc5a416" exitCode=0 Feb 25 11:40:21 crc kubenswrapper[4725]: I0225 11:40:21.293898 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7p5z" event={"ID":"024add56-921d-4c61-83fa-160e5ec2dba1","Type":"ContainerDied","Data":"826cd21fb35eefe9f42192cd0e127cfc73fb8654e3a78d2c5baff6bd2bc5a416"} Feb 25 11:40:21 crc kubenswrapper[4725]: I0225 11:40:21.293957 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7p5z" event={"ID":"024add56-921d-4c61-83fa-160e5ec2dba1","Type":"ContainerStarted","Data":"e4246aa01185145eda470005b567388821e8552bf7f48c0c8e5818307f04fa1e"} Feb 25 11:40:22 crc kubenswrapper[4725]: I0225 11:40:22.309481 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7p5z" 
event={"ID":"024add56-921d-4c61-83fa-160e5ec2dba1","Type":"ContainerStarted","Data":"ac7a56df3b90225baaf78a8f4c8b860bf916e578775afb7a17093b1131c983ce"} Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.286266 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fzgbd"] Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.291817 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.302103 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzgbd"] Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.332509 4725 generic.go:334] "Generic (PLEG): container finished" podID="024add56-921d-4c61-83fa-160e5ec2dba1" containerID="ac7a56df3b90225baaf78a8f4c8b860bf916e578775afb7a17093b1131c983ce" exitCode=0 Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.332562 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7p5z" event={"ID":"024add56-921d-4c61-83fa-160e5ec2dba1","Type":"ContainerDied","Data":"ac7a56df3b90225baaf78a8f4c8b860bf916e578775afb7a17093b1131c983ce"} Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.408965 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-catalog-content\") pod \"redhat-marketplace-fzgbd\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.411265 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnd8w\" (UniqueName: \"kubernetes.io/projected/525a83a6-912a-4964-82d9-2fe912d13669-kube-api-access-hnd8w\") pod \"redhat-marketplace-fzgbd\" 
(UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.411588 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-utilities\") pod \"redhat-marketplace-fzgbd\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.516308 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-catalog-content\") pod \"redhat-marketplace-fzgbd\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.516876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnd8w\" (UniqueName: \"kubernetes.io/projected/525a83a6-912a-4964-82d9-2fe912d13669-kube-api-access-hnd8w\") pod \"redhat-marketplace-fzgbd\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.516952 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-utilities\") pod \"redhat-marketplace-fzgbd\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.517552 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-catalog-content\") pod \"redhat-marketplace-fzgbd\" (UID: 
\"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.517629 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-utilities\") pod \"redhat-marketplace-fzgbd\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.550130 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnd8w\" (UniqueName: \"kubernetes.io/projected/525a83a6-912a-4964-82d9-2fe912d13669-kube-api-access-hnd8w\") pod \"redhat-marketplace-fzgbd\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") " pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:23 crc kubenswrapper[4725]: I0225 11:40:23.619804 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzgbd" Feb 25 11:40:24 crc kubenswrapper[4725]: I0225 11:40:24.064196 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzgbd"] Feb 25 11:40:24 crc kubenswrapper[4725]: W0225 11:40:24.071786 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525a83a6_912a_4964_82d9_2fe912d13669.slice/crio-d7d7dc412fb29427129cc135bedc120cda88c338d96d1ab34fe85f246028d6f7 WatchSource:0}: Error finding container d7d7dc412fb29427129cc135bedc120cda88c338d96d1ab34fe85f246028d6f7: Status 404 returned error can't find the container with id d7d7dc412fb29427129cc135bedc120cda88c338d96d1ab34fe85f246028d6f7 Feb 25 11:40:24 crc kubenswrapper[4725]: I0225 11:40:24.344304 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzgbd" 
event={"ID":"525a83a6-912a-4964-82d9-2fe912d13669","Type":"ContainerDied","Data":"2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a"} Feb 25 11:40:24 crc kubenswrapper[4725]: I0225 11:40:24.344719 4725 generic.go:334] "Generic (PLEG): container finished" podID="525a83a6-912a-4964-82d9-2fe912d13669" containerID="2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a" exitCode=0 Feb 25 11:40:24 crc kubenswrapper[4725]: I0225 11:40:24.344861 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzgbd" event={"ID":"525a83a6-912a-4964-82d9-2fe912d13669","Type":"ContainerStarted","Data":"d7d7dc412fb29427129cc135bedc120cda88c338d96d1ab34fe85f246028d6f7"} Feb 25 11:40:24 crc kubenswrapper[4725]: I0225 11:40:24.350128 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7p5z" event={"ID":"024add56-921d-4c61-83fa-160e5ec2dba1","Type":"ContainerStarted","Data":"f3962aab69c4d6d788f0b25e8182b7cabb64eee2193ca0f4e4dcc922e94effcc"} Feb 25 11:40:24 crc kubenswrapper[4725]: I0225 11:40:24.387052 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s7p5z" podStartSLOduration=2.7269191470000003 podStartE2EDuration="5.387032038s" podCreationTimestamp="2026-02-25 11:40:19 +0000 UTC" firstStartedPulling="2026-02-25 11:40:21.297865417 +0000 UTC m=+2846.796447472" lastFinishedPulling="2026-02-25 11:40:23.957978338 +0000 UTC m=+2849.456560363" observedRunningTime="2026-02-25 11:40:24.383635797 +0000 UTC m=+2849.882217832" watchObservedRunningTime="2026-02-25 11:40:24.387032038 +0000 UTC m=+2849.885614073" Feb 25 11:40:26 crc kubenswrapper[4725]: I0225 11:40:26.373125 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzgbd" 
event={"ID":"525a83a6-912a-4964-82d9-2fe912d13669","Type":"ContainerDied","Data":"4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8"}
Feb 25 11:40:26 crc kubenswrapper[4725]: I0225 11:40:26.372945 4725 generic.go:334] "Generic (PLEG): container finished" podID="525a83a6-912a-4964-82d9-2fe912d13669" containerID="4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8" exitCode=0
Feb 25 11:40:27 crc kubenswrapper[4725]: I0225 11:40:27.386416 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzgbd" event={"ID":"525a83a6-912a-4964-82d9-2fe912d13669","Type":"ContainerStarted","Data":"b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f"}
Feb 25 11:40:27 crc kubenswrapper[4725]: I0225 11:40:27.416976 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fzgbd" podStartSLOduration=1.9832615900000001 podStartE2EDuration="4.41696108s" podCreationTimestamp="2026-02-25 11:40:23 +0000 UTC" firstStartedPulling="2026-02-25 11:40:24.344819913 +0000 UTC m=+2849.843401948" lastFinishedPulling="2026-02-25 11:40:26.778519403 +0000 UTC m=+2852.277101438" observedRunningTime="2026-02-25 11:40:27.412376078 +0000 UTC m=+2852.910958163" watchObservedRunningTime="2026-02-25 11:40:27.41696108 +0000 UTC m=+2852.915543105"
Feb 25 11:40:30 crc kubenswrapper[4725]: I0225 11:40:30.045323 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s7p5z"
Feb 25 11:40:30 crc kubenswrapper[4725]: I0225 11:40:30.046053 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s7p5z"
Feb 25 11:40:30 crc kubenswrapper[4725]: I0225 11:40:30.091000 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s7p5z"
Feb 25 11:40:30 crc kubenswrapper[4725]: I0225 11:40:30.481679 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s7p5z"
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.272254 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7p5z"]
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.274311 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s7p5z" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="registry-server" containerID="cri-o://f3962aab69c4d6d788f0b25e8182b7cabb64eee2193ca0f4e4dcc922e94effcc" gracePeriod=2
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.456411 4725 generic.go:334] "Generic (PLEG): container finished" podID="024add56-921d-4c61-83fa-160e5ec2dba1" containerID="f3962aab69c4d6d788f0b25e8182b7cabb64eee2193ca0f4e4dcc922e94effcc" exitCode=0
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.456469 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7p5z" event={"ID":"024add56-921d-4c61-83fa-160e5ec2dba1","Type":"ContainerDied","Data":"f3962aab69c4d6d788f0b25e8182b7cabb64eee2193ca0f4e4dcc922e94effcc"}
Feb 25 11:40:33 crc kubenswrapper[4725]: E0225 11:40:33.508010 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024add56_921d_4c61_83fa_160e5ec2dba1.slice/crio-conmon-f3962aab69c4d6d788f0b25e8182b7cabb64eee2193ca0f4e4dcc922e94effcc.scope\": RecentStats: unable to find data in memory cache]"
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.623475 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fzgbd"
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.623973 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fzgbd"
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.685508 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fzgbd"
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.819984 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7p5z"
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.942588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-utilities\") pod \"024add56-921d-4c61-83fa-160e5ec2dba1\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") "
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.942672 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-catalog-content\") pod \"024add56-921d-4c61-83fa-160e5ec2dba1\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") "
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.942804 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qprpv\" (UniqueName: \"kubernetes.io/projected/024add56-921d-4c61-83fa-160e5ec2dba1-kube-api-access-qprpv\") pod \"024add56-921d-4c61-83fa-160e5ec2dba1\" (UID: \"024add56-921d-4c61-83fa-160e5ec2dba1\") "
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.943222 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-utilities" (OuterVolumeSpecName: "utilities") pod "024add56-921d-4c61-83fa-160e5ec2dba1" (UID: "024add56-921d-4c61-83fa-160e5ec2dba1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.943437 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 11:40:33 crc kubenswrapper[4725]: I0225 11:40:33.950226 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024add56-921d-4c61-83fa-160e5ec2dba1-kube-api-access-qprpv" (OuterVolumeSpecName: "kube-api-access-qprpv") pod "024add56-921d-4c61-83fa-160e5ec2dba1" (UID: "024add56-921d-4c61-83fa-160e5ec2dba1"). InnerVolumeSpecName "kube-api-access-qprpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.000369 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "024add56-921d-4c61-83fa-160e5ec2dba1" (UID: "024add56-921d-4c61-83fa-160e5ec2dba1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.044871 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/024add56-921d-4c61-83fa-160e5ec2dba1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.044901 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qprpv\" (UniqueName: \"kubernetes.io/projected/024add56-921d-4c61-83fa-160e5ec2dba1-kube-api-access-qprpv\") on node \"crc\" DevicePath \"\""
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.468310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7p5z" event={"ID":"024add56-921d-4c61-83fa-160e5ec2dba1","Type":"ContainerDied","Data":"e4246aa01185145eda470005b567388821e8552bf7f48c0c8e5818307f04fa1e"}
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.468368 4725 scope.go:117] "RemoveContainer" containerID="f3962aab69c4d6d788f0b25e8182b7cabb64eee2193ca0f4e4dcc922e94effcc"
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.468369 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7p5z"
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.491390 4725 scope.go:117] "RemoveContainer" containerID="ac7a56df3b90225baaf78a8f4c8b860bf916e578775afb7a17093b1131c983ce"
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.525182 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7p5z"]
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.531569 4725 scope.go:117] "RemoveContainer" containerID="826cd21fb35eefe9f42192cd0e127cfc73fb8654e3a78d2c5baff6bd2bc5a416"
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.537724 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s7p5z"]
Feb 25 11:40:34 crc kubenswrapper[4725]: I0225 11:40:34.539888 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fzgbd"
Feb 25 11:40:35 crc kubenswrapper[4725]: I0225 11:40:35.236842 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" path="/var/lib/kubelet/pods/024add56-921d-4c61-83fa-160e5ec2dba1/volumes"
Feb 25 11:40:36 crc kubenswrapper[4725]: I0225 11:40:36.072173 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzgbd"]
Feb 25 11:40:36 crc kubenswrapper[4725]: I0225 11:40:36.488214 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fzgbd" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="registry-server" containerID="cri-o://b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f" gracePeriod=2
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.112076 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzgbd"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.214822 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-catalog-content\") pod \"525a83a6-912a-4964-82d9-2fe912d13669\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") "
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.214993 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-utilities\") pod \"525a83a6-912a-4964-82d9-2fe912d13669\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") "
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.215145 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnd8w\" (UniqueName: \"kubernetes.io/projected/525a83a6-912a-4964-82d9-2fe912d13669-kube-api-access-hnd8w\") pod \"525a83a6-912a-4964-82d9-2fe912d13669\" (UID: \"525a83a6-912a-4964-82d9-2fe912d13669\") "
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.216626 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-utilities" (OuterVolumeSpecName: "utilities") pod "525a83a6-912a-4964-82d9-2fe912d13669" (UID: "525a83a6-912a-4964-82d9-2fe912d13669"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.230570 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525a83a6-912a-4964-82d9-2fe912d13669-kube-api-access-hnd8w" (OuterVolumeSpecName: "kube-api-access-hnd8w") pod "525a83a6-912a-4964-82d9-2fe912d13669" (UID: "525a83a6-912a-4964-82d9-2fe912d13669"). InnerVolumeSpecName "kube-api-access-hnd8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.247344 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "525a83a6-912a-4964-82d9-2fe912d13669" (UID: "525a83a6-912a-4964-82d9-2fe912d13669"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.317531 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnd8w\" (UniqueName: \"kubernetes.io/projected/525a83a6-912a-4964-82d9-2fe912d13669-kube-api-access-hnd8w\") on node \"crc\" DevicePath \"\""
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.317580 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.317599 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525a83a6-912a-4964-82d9-2fe912d13669-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.499516 4725 generic.go:334] "Generic (PLEG): container finished" podID="525a83a6-912a-4964-82d9-2fe912d13669" containerID="b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f" exitCode=0
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.499564 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzgbd" event={"ID":"525a83a6-912a-4964-82d9-2fe912d13669","Type":"ContainerDied","Data":"b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f"}
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.499598 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzgbd" event={"ID":"525a83a6-912a-4964-82d9-2fe912d13669","Type":"ContainerDied","Data":"d7d7dc412fb29427129cc135bedc120cda88c338d96d1ab34fe85f246028d6f7"}
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.499618 4725 scope.go:117] "RemoveContainer" containerID="b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.500022 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzgbd"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.525525 4725 scope.go:117] "RemoveContainer" containerID="4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.548622 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzgbd"]
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.559975 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzgbd"]
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.584866 4725 scope.go:117] "RemoveContainer" containerID="2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.613218 4725 scope.go:117] "RemoveContainer" containerID="b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f"
Feb 25 11:40:37 crc kubenswrapper[4725]: E0225 11:40:37.613667 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f\": container with ID starting with b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f not found: ID does not exist" containerID="b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.613706 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f"} err="failed to get container status \"b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f\": rpc error: code = NotFound desc = could not find container \"b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f\": container with ID starting with b4b7cdef636bcd6ecdcebead5afb38a4493805f3235afb7aa1edf6bf9d92a90f not found: ID does not exist"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.613732 4725 scope.go:117] "RemoveContainer" containerID="4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8"
Feb 25 11:40:37 crc kubenswrapper[4725]: E0225 11:40:37.614225 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8\": container with ID starting with 4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8 not found: ID does not exist" containerID="4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.614266 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8"} err="failed to get container status \"4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8\": rpc error: code = NotFound desc = could not find container \"4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8\": container with ID starting with 4d4bec4bffa613a27587570975984bac151c7a7d32e9533dceb100a55c884cf8 not found: ID does not exist"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.614294 4725 scope.go:117] "RemoveContainer" containerID="2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a"
Feb 25 11:40:37 crc kubenswrapper[4725]: E0225 11:40:37.614665 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a\": container with ID starting with 2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a not found: ID does not exist" containerID="2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a"
Feb 25 11:40:37 crc kubenswrapper[4725]: I0225 11:40:37.614691 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a"} err="failed to get container status \"2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a\": rpc error: code = NotFound desc = could not find container \"2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a\": container with ID starting with 2835e6ead2237e83b535dcd8947d46bb916e89c25a2bcb3a0885dab15b8b5a1a not found: ID does not exist"
Feb 25 11:40:39 crc kubenswrapper[4725]: I0225 11:40:39.235056 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525a83a6-912a-4964-82d9-2fe912d13669" path="/var/lib/kubelet/pods/525a83a6-912a-4964-82d9-2fe912d13669/volumes"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.150399 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533662-s664f"]
Feb 25 11:42:00 crc kubenswrapper[4725]: E0225 11:42:00.151414 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="extract-utilities"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151430 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="extract-utilities"
Feb 25 11:42:00 crc kubenswrapper[4725]: E0225 11:42:00.151461 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="extract-utilities"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151467 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="extract-utilities"
Feb 25 11:42:00 crc kubenswrapper[4725]: E0225 11:42:00.151475 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="registry-server"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151481 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="registry-server"
Feb 25 11:42:00 crc kubenswrapper[4725]: E0225 11:42:00.151495 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="extract-content"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151501 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="extract-content"
Feb 25 11:42:00 crc kubenswrapper[4725]: E0225 11:42:00.151514 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="extract-content"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151519 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="extract-content"
Feb 25 11:42:00 crc kubenswrapper[4725]: E0225 11:42:00.151531 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="registry-server"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151537 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="registry-server"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151709 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="525a83a6-912a-4964-82d9-2fe912d13669" containerName="registry-server"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.151726 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="024add56-921d-4c61-83fa-160e5ec2dba1" containerName="registry-server"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.152387 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533662-s664f"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.154450 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.154892 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.155648 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.162464 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533662-s664f"]
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.260637 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ffv\" (UniqueName: \"kubernetes.io/projected/91babeeb-e052-4777-bae5-6e19023d92df-kube-api-access-b9ffv\") pod \"auto-csr-approver-29533662-s664f\" (UID: \"91babeeb-e052-4777-bae5-6e19023d92df\") " pod="openshift-infra/auto-csr-approver-29533662-s664f"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.361952 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ffv\" (UniqueName: \"kubernetes.io/projected/91babeeb-e052-4777-bae5-6e19023d92df-kube-api-access-b9ffv\") pod \"auto-csr-approver-29533662-s664f\" (UID: \"91babeeb-e052-4777-bae5-6e19023d92df\") " pod="openshift-infra/auto-csr-approver-29533662-s664f"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.391877 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ffv\" (UniqueName: \"kubernetes.io/projected/91babeeb-e052-4777-bae5-6e19023d92df-kube-api-access-b9ffv\") pod \"auto-csr-approver-29533662-s664f\" (UID: \"91babeeb-e052-4777-bae5-6e19023d92df\") " pod="openshift-infra/auto-csr-approver-29533662-s664f"
Feb 25 11:42:00 crc kubenswrapper[4725]: I0225 11:42:00.473454 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533662-s664f"
Feb 25 11:42:01 crc kubenswrapper[4725]: I0225 11:42:01.006387 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533662-s664f"]
Feb 25 11:42:01 crc kubenswrapper[4725]: W0225 11:42:01.010382 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91babeeb_e052_4777_bae5_6e19023d92df.slice/crio-38381cb6235599a0dff2ac72f2330931120089365ecd08274a0b7f5db0b90690 WatchSource:0}: Error finding container 38381cb6235599a0dff2ac72f2330931120089365ecd08274a0b7f5db0b90690: Status 404 returned error can't find the container with id 38381cb6235599a0dff2ac72f2330931120089365ecd08274a0b7f5db0b90690
Feb 25 11:42:01 crc kubenswrapper[4725]: I0225 11:42:01.467014 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533662-s664f" event={"ID":"91babeeb-e052-4777-bae5-6e19023d92df","Type":"ContainerStarted","Data":"38381cb6235599a0dff2ac72f2330931120089365ecd08274a0b7f5db0b90690"}
Feb 25 11:42:03 crc kubenswrapper[4725]: I0225 11:42:03.492482 4725 generic.go:334] "Generic (PLEG): container finished" podID="91babeeb-e052-4777-bae5-6e19023d92df" containerID="441aa5a172dbc380697f94f1ed92f72a71085d2ee92200935117e576f6b931e6" exitCode=0
Feb 25 11:42:03 crc kubenswrapper[4725]: I0225 11:42:03.492553 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533662-s664f" event={"ID":"91babeeb-e052-4777-bae5-6e19023d92df","Type":"ContainerDied","Data":"441aa5a172dbc380697f94f1ed92f72a71085d2ee92200935117e576f6b931e6"}
Feb 25 11:42:04 crc kubenswrapper[4725]: I0225 11:42:04.987247 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533662-s664f"
Feb 25 11:42:05 crc kubenswrapper[4725]: I0225 11:42:05.065065 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ffv\" (UniqueName: \"kubernetes.io/projected/91babeeb-e052-4777-bae5-6e19023d92df-kube-api-access-b9ffv\") pod \"91babeeb-e052-4777-bae5-6e19023d92df\" (UID: \"91babeeb-e052-4777-bae5-6e19023d92df\") "
Feb 25 11:42:05 crc kubenswrapper[4725]: I0225 11:42:05.073023 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91babeeb-e052-4777-bae5-6e19023d92df-kube-api-access-b9ffv" (OuterVolumeSpecName: "kube-api-access-b9ffv") pod "91babeeb-e052-4777-bae5-6e19023d92df" (UID: "91babeeb-e052-4777-bae5-6e19023d92df"). InnerVolumeSpecName "kube-api-access-b9ffv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:42:05 crc kubenswrapper[4725]: I0225 11:42:05.168510 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ffv\" (UniqueName: \"kubernetes.io/projected/91babeeb-e052-4777-bae5-6e19023d92df-kube-api-access-b9ffv\") on node \"crc\" DevicePath \"\""
Feb 25 11:42:05 crc kubenswrapper[4725]: I0225 11:42:05.527090 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533662-s664f" event={"ID":"91babeeb-e052-4777-bae5-6e19023d92df","Type":"ContainerDied","Data":"38381cb6235599a0dff2ac72f2330931120089365ecd08274a0b7f5db0b90690"}
Feb 25 11:42:05 crc kubenswrapper[4725]: I0225 11:42:05.527137 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38381cb6235599a0dff2ac72f2330931120089365ecd08274a0b7f5db0b90690"
Feb 25 11:42:05 crc kubenswrapper[4725]: I0225 11:42:05.527213 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533662-s664f"
Feb 25 11:42:06 crc kubenswrapper[4725]: I0225 11:42:06.062038 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533656-s2snx"]
Feb 25 11:42:06 crc kubenswrapper[4725]: I0225 11:42:06.072265 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533656-s2snx"]
Feb 25 11:42:07 crc kubenswrapper[4725]: I0225 11:42:07.251518 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fa4551-240f-4e9f-a3ba-ca8918695d0e" path="/var/lib/kubelet/pods/03fa4551-240f-4e9f-a3ba-ca8918695d0e/volumes"
Feb 25 11:42:11 crc kubenswrapper[4725]: I0225 11:42:11.555519 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:42:11 crc kubenswrapper[4725]: I0225 11:42:11.556089 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:42:21 crc kubenswrapper[4725]: I0225 11:42:21.380475 4725 scope.go:117] "RemoveContainer" containerID="aff3b4b66c877b21570fdd97b46481ea9e3f970e38ed73b20fb49d13586fe5dc"
Feb 25 11:42:41 crc kubenswrapper[4725]: I0225 11:42:41.555181 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:42:41 crc kubenswrapper[4725]: I0225 11:42:41.555855 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:43:11 crc kubenswrapper[4725]: I0225 11:43:11.555264 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:43:11 crc kubenswrapper[4725]: I0225 11:43:11.555956 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:43:11 crc kubenswrapper[4725]: I0225 11:43:11.556018 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 11:43:11 crc kubenswrapper[4725]: I0225 11:43:11.557030 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 11:43:11 crc kubenswrapper[4725]: I0225 11:43:11.557149 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" gracePeriod=600
Feb 25 11:43:11 crc kubenswrapper[4725]: I0225 11:43:11.986035 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e2b92e78-7b23-469e-9220-9ea38d9cba32" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Feb 25 11:43:16 crc kubenswrapper[4725]: I0225 11:43:16.986270 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e2b92e78-7b23-469e-9220-9ea38d9cba32" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Feb 25 11:43:16 crc kubenswrapper[4725]: I0225 11:43:16.986486 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e2b92e78-7b23-469e-9220-9ea38d9cba32" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Feb 25 11:43:18 crc kubenswrapper[4725]: I0225 11:43:18.464109 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="5a023b0b-cd51-47db-9fdf-74c673713272" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.174:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.112634 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-daemon-256sf_c4742f60-e555-4f96-be12-b9e46a857bd4/machine-config-daemon/11.log"
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.114228 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" exitCode=-1
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.114281 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"}
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.114341 4725 scope.go:117] "RemoveContainer" containerID="5f1352735f70d60c0184810ffaa1295427b1343d8d452f8dc314abfa9f82a71a"
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.804060 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e2b92e78-7b23-469e-9220-9ea38d9cba32" containerName="ceilometer-central-agent" probeResult="failure" output=<
Feb 25 11:43:20 crc kubenswrapper[4725]: Unkown error: Expecting value: line 1 column 1 (char 0)
Feb 25 11:43:20 crc kubenswrapper[4725]: >
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.804855 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.805905 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"0a2b2ffc58c711398dac44617bb4a676ca73ceffe9656290dc0fd05eec4dc2e8"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted"
Feb 25 11:43:20 crc kubenswrapper[4725]: I0225 11:43:20.806167 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2b92e78-7b23-469e-9220-9ea38d9cba32" containerName="ceilometer-central-agent" containerID="cri-o://0a2b2ffc58c711398dac44617bb4a676ca73ceffe9656290dc0fd05eec4dc2e8" gracePeriod=30
Feb 25 11:43:20 crc kubenswrapper[4725]: E0225 11:43:20.841695 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:43:21 crc kubenswrapper[4725]: I0225 11:43:21.125400 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:43:21 crc kubenswrapper[4725]: E0225 11:43:21.125664 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:43:22 crc kubenswrapper[4725]: I0225 11:43:22.138984 4725 generic.go:334] "Generic (PLEG): container finished" podID="e2b92e78-7b23-469e-9220-9ea38d9cba32" containerID="0a2b2ffc58c711398dac44617bb4a676ca73ceffe9656290dc0fd05eec4dc2e8" exitCode=0
Feb 25 11:43:22 crc kubenswrapper[4725]: I0225 11:43:22.139066 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2b92e78-7b23-469e-9220-9ea38d9cba32","Type":"ContainerDied","Data":"0a2b2ffc58c711398dac44617bb4a676ca73ceffe9656290dc0fd05eec4dc2e8"}
Feb 25 11:43:22 crc kubenswrapper[4725]: I0225 11:43:22.140462 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2b92e78-7b23-469e-9220-9ea38d9cba32","Type":"ContainerStarted","Data":"3fd2e84022c3659ebf99ec304cbfe91c7ffdc7415d5ec4e3c5d93182ea2f18b6"}
Feb 25 11:43:33 crc kubenswrapper[4725]: I0225 11:43:33.225166 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:43:33 crc kubenswrapper[4725]: E0225 11:43:33.238383 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:43:48 crc kubenswrapper[4725]: I0225 11:43:48.224917 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:43:48 crc kubenswrapper[4725]: E0225 11:43:48.225742 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.141667 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533664-4l5bj"] Feb 25 11:44:00 crc kubenswrapper[4725]: E0225 11:44:00.142818 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91babeeb-e052-4777-bae5-6e19023d92df" containerName="oc" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.142859 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="91babeeb-e052-4777-bae5-6e19023d92df" containerName="oc" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.143229 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="91babeeb-e052-4777-bae5-6e19023d92df" containerName="oc" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.144307 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.147324 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.147348 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.151063 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533664-4l5bj"] Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.152020 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.268297 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqn57\" (UniqueName: \"kubernetes.io/projected/31181682-fa6c-417c-a2ba-08842aeca089-kube-api-access-hqn57\") pod \"auto-csr-approver-29533664-4l5bj\" (UID: \"31181682-fa6c-417c-a2ba-08842aeca089\") " pod="openshift-infra/auto-csr-approver-29533664-4l5bj" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.370655 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqn57\" (UniqueName: \"kubernetes.io/projected/31181682-fa6c-417c-a2ba-08842aeca089-kube-api-access-hqn57\") pod \"auto-csr-approver-29533664-4l5bj\" (UID: \"31181682-fa6c-417c-a2ba-08842aeca089\") " pod="openshift-infra/auto-csr-approver-29533664-4l5bj" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.405683 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqn57\" (UniqueName: \"kubernetes.io/projected/31181682-fa6c-417c-a2ba-08842aeca089-kube-api-access-hqn57\") pod \"auto-csr-approver-29533664-4l5bj\" (UID: \"31181682-fa6c-417c-a2ba-08842aeca089\") " 
pod="openshift-infra/auto-csr-approver-29533664-4l5bj" Feb 25 11:44:00 crc kubenswrapper[4725]: I0225 11:44:00.467418 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" Feb 25 11:44:01 crc kubenswrapper[4725]: I0225 11:44:01.055856 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533664-4l5bj"] Feb 25 11:44:01 crc kubenswrapper[4725]: W0225 11:44:01.068096 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31181682_fa6c_417c_a2ba_08842aeca089.slice/crio-43262d3edfe8883aa2c262e4e21b9ac4ed7e6f89c5c99d1a8d92f2ea87de3f73 WatchSource:0}: Error finding container 43262d3edfe8883aa2c262e4e21b9ac4ed7e6f89c5c99d1a8d92f2ea87de3f73: Status 404 returned error can't find the container with id 43262d3edfe8883aa2c262e4e21b9ac4ed7e6f89c5c99d1a8d92f2ea87de3f73 Feb 25 11:44:01 crc kubenswrapper[4725]: I0225 11:44:01.655034 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" event={"ID":"31181682-fa6c-417c-a2ba-08842aeca089","Type":"ContainerStarted","Data":"43262d3edfe8883aa2c262e4e21b9ac4ed7e6f89c5c99d1a8d92f2ea87de3f73"} Feb 25 11:44:02 crc kubenswrapper[4725]: I0225 11:44:02.241230 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:44:02 crc kubenswrapper[4725]: E0225 11:44:02.242906 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:44:02 crc 
kubenswrapper[4725]: I0225 11:44:02.664467 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" event={"ID":"31181682-fa6c-417c-a2ba-08842aeca089","Type":"ContainerStarted","Data":"8d8ed389819106127cc3bb70ddd24ce9497d1a930b889c31e1f14844b5aeee02"} Feb 25 11:44:03 crc kubenswrapper[4725]: I0225 11:44:03.677301 4725 generic.go:334] "Generic (PLEG): container finished" podID="31181682-fa6c-417c-a2ba-08842aeca089" containerID="8d8ed389819106127cc3bb70ddd24ce9497d1a930b889c31e1f14844b5aeee02" exitCode=0 Feb 25 11:44:03 crc kubenswrapper[4725]: I0225 11:44:03.677349 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" event={"ID":"31181682-fa6c-417c-a2ba-08842aeca089","Type":"ContainerDied","Data":"8d8ed389819106127cc3bb70ddd24ce9497d1a930b889c31e1f14844b5aeee02"} Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.045707 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.155265 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqn57\" (UniqueName: \"kubernetes.io/projected/31181682-fa6c-417c-a2ba-08842aeca089-kube-api-access-hqn57\") pod \"31181682-fa6c-417c-a2ba-08842aeca089\" (UID: \"31181682-fa6c-417c-a2ba-08842aeca089\") " Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.163613 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31181682-fa6c-417c-a2ba-08842aeca089-kube-api-access-hqn57" (OuterVolumeSpecName: "kube-api-access-hqn57") pod "31181682-fa6c-417c-a2ba-08842aeca089" (UID: "31181682-fa6c-417c-a2ba-08842aeca089"). InnerVolumeSpecName "kube-api-access-hqn57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.257895 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqn57\" (UniqueName: \"kubernetes.io/projected/31181682-fa6c-417c-a2ba-08842aeca089-kube-api-access-hqn57\") on node \"crc\" DevicePath \"\"" Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.694650 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" event={"ID":"31181682-fa6c-417c-a2ba-08842aeca089","Type":"ContainerDied","Data":"43262d3edfe8883aa2c262e4e21b9ac4ed7e6f89c5c99d1a8d92f2ea87de3f73"} Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.694688 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43262d3edfe8883aa2c262e4e21b9ac4ed7e6f89c5c99d1a8d92f2ea87de3f73" Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.694741 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533664-4l5bj" Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.755874 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533658-btgxw"] Feb 25 11:44:05 crc kubenswrapper[4725]: I0225 11:44:05.763762 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533658-btgxw"] Feb 25 11:44:07 crc kubenswrapper[4725]: I0225 11:44:07.237870 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf1ac0d-b1ef-4cb2-9a01-993c807a2865" path="/var/lib/kubelet/pods/6cf1ac0d-b1ef-4cb2-9a01-993c807a2865/volumes" Feb 25 11:44:14 crc kubenswrapper[4725]: I0225 11:44:14.225212 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:44:14 crc kubenswrapper[4725]: E0225 11:44:14.226411 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:44:21 crc kubenswrapper[4725]: I0225 11:44:21.474082 4725 scope.go:117] "RemoveContainer" containerID="b962be7edc0bacd0479ba97885087a92c712ae8583d496e0d4431b166d979358" Feb 25 11:44:25 crc kubenswrapper[4725]: I0225 11:44:25.231206 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:44:25 crc kubenswrapper[4725]: E0225 11:44:25.232156 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:44:38 crc kubenswrapper[4725]: I0225 11:44:38.224069 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:44:38 crc kubenswrapper[4725]: E0225 11:44:38.225657 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:44:53 crc kubenswrapper[4725]: I0225 11:44:53.224604 4725 scope.go:117] "RemoveContainer" 
containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:44:53 crc kubenswrapper[4725]: E0225 11:44:53.225684 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.159132 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct"] Feb 25 11:45:00 crc kubenswrapper[4725]: E0225 11:45:00.160015 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31181682-fa6c-417c-a2ba-08842aeca089" containerName="oc" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.160030 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="31181682-fa6c-417c-a2ba-08842aeca089" containerName="oc" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.160218 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="31181682-fa6c-417c-a2ba-08842aeca089" containerName="oc" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.160783 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.163401 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.163406 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.197922 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct"] Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.290158 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bea8ee2-f811-4dce-a140-624bc53c8b34-secret-volume\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.290206 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh746\" (UniqueName: \"kubernetes.io/projected/8bea8ee2-f811-4dce-a140-624bc53c8b34-kube-api-access-nh746\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.290253 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bea8ee2-f811-4dce-a140-624bc53c8b34-config-volume\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.392991 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bea8ee2-f811-4dce-a140-624bc53c8b34-secret-volume\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.393070 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh746\" (UniqueName: \"kubernetes.io/projected/8bea8ee2-f811-4dce-a140-624bc53c8b34-kube-api-access-nh746\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.393117 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bea8ee2-f811-4dce-a140-624bc53c8b34-config-volume\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.394281 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bea8ee2-f811-4dce-a140-624bc53c8b34-config-volume\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.399216 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8bea8ee2-f811-4dce-a140-624bc53c8b34-secret-volume\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.411191 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh746\" (UniqueName: \"kubernetes.io/projected/8bea8ee2-f811-4dce-a140-624bc53c8b34-kube-api-access-nh746\") pod \"collect-profiles-29533665-n9pct\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.482414 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:00 crc kubenswrapper[4725]: I0225 11:45:00.912957 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct"] Feb 25 11:45:01 crc kubenswrapper[4725]: I0225 11:45:01.233538 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" event={"ID":"8bea8ee2-f811-4dce-a140-624bc53c8b34","Type":"ContainerStarted","Data":"5588aa2e44ce0ab0d06a8ad949759235d4c780cc78601704882f0e2e32637448"} Feb 25 11:45:01 crc kubenswrapper[4725]: I0225 11:45:01.233873 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" event={"ID":"8bea8ee2-f811-4dce-a140-624bc53c8b34","Type":"ContainerStarted","Data":"718b386084d5b4d7733e75499e70bb6fb8f49b3c60b889af511c95fe7a10602b"} Feb 25 11:45:01 crc kubenswrapper[4725]: I0225 11:45:01.249085 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" 
podStartSLOduration=1.249062143 podStartE2EDuration="1.249062143s" podCreationTimestamp="2026-02-25 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 11:45:01.245852988 +0000 UTC m=+3126.744435033" watchObservedRunningTime="2026-02-25 11:45:01.249062143 +0000 UTC m=+3126.747644188" Feb 25 11:45:02 crc kubenswrapper[4725]: I0225 11:45:02.242685 4725 generic.go:334] "Generic (PLEG): container finished" podID="8bea8ee2-f811-4dce-a140-624bc53c8b34" containerID="5588aa2e44ce0ab0d06a8ad949759235d4c780cc78601704882f0e2e32637448" exitCode=0 Feb 25 11:45:02 crc kubenswrapper[4725]: I0225 11:45:02.242735 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" event={"ID":"8bea8ee2-f811-4dce-a140-624bc53c8b34","Type":"ContainerDied","Data":"5588aa2e44ce0ab0d06a8ad949759235d4c780cc78601704882f0e2e32637448"} Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.803913 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.860105 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh746\" (UniqueName: \"kubernetes.io/projected/8bea8ee2-f811-4dce-a140-624bc53c8b34-kube-api-access-nh746\") pod \"8bea8ee2-f811-4dce-a140-624bc53c8b34\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.860354 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bea8ee2-f811-4dce-a140-624bc53c8b34-secret-volume\") pod \"8bea8ee2-f811-4dce-a140-624bc53c8b34\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.860457 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bea8ee2-f811-4dce-a140-624bc53c8b34-config-volume\") pod \"8bea8ee2-f811-4dce-a140-624bc53c8b34\" (UID: \"8bea8ee2-f811-4dce-a140-624bc53c8b34\") " Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.861308 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bea8ee2-f811-4dce-a140-624bc53c8b34-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bea8ee2-f811-4dce-a140-624bc53c8b34" (UID: "8bea8ee2-f811-4dce-a140-624bc53c8b34"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.868634 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bea8ee2-f811-4dce-a140-624bc53c8b34-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8bea8ee2-f811-4dce-a140-624bc53c8b34" (UID: "8bea8ee2-f811-4dce-a140-624bc53c8b34"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.869772 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bea8ee2-f811-4dce-a140-624bc53c8b34-kube-api-access-nh746" (OuterVolumeSpecName: "kube-api-access-nh746") pod "8bea8ee2-f811-4dce-a140-624bc53c8b34" (UID: "8bea8ee2-f811-4dce-a140-624bc53c8b34"). InnerVolumeSpecName "kube-api-access-nh746". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.962641 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bea8ee2-f811-4dce-a140-624bc53c8b34-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.962671 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bea8ee2-f811-4dce-a140-624bc53c8b34-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 11:45:03 crc kubenswrapper[4725]: I0225 11:45:03.962681 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh746\" (UniqueName: \"kubernetes.io/projected/8bea8ee2-f811-4dce-a140-624bc53c8b34-kube-api-access-nh746\") on node \"crc\" DevicePath \"\"" Feb 25 11:45:04 crc kubenswrapper[4725]: I0225 11:45:04.224600 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:45:04 crc kubenswrapper[4725]: E0225 11:45:04.225234 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" 
Feb 25 11:45:04 crc kubenswrapper[4725]: I0225 11:45:04.338683 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" event={"ID":"8bea8ee2-f811-4dce-a140-624bc53c8b34","Type":"ContainerDied","Data":"718b386084d5b4d7733e75499e70bb6fb8f49b3c60b889af511c95fe7a10602b"} Feb 25 11:45:04 crc kubenswrapper[4725]: I0225 11:45:04.338738 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="718b386084d5b4d7733e75499e70bb6fb8f49b3c60b889af511c95fe7a10602b" Feb 25 11:45:04 crc kubenswrapper[4725]: I0225 11:45:04.338748 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533665-n9pct" Feb 25 11:45:04 crc kubenswrapper[4725]: I0225 11:45:04.345088 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq"] Feb 25 11:45:04 crc kubenswrapper[4725]: I0225 11:45:04.359470 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533620-8q6wq"] Feb 25 11:45:05 crc kubenswrapper[4725]: I0225 11:45:05.234620 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc7e258-78da-488a-923a-d133cc3a1d03" path="/var/lib/kubelet/pods/ffc7e258-78da-488a-923a-d133cc3a1d03/volumes" Feb 25 11:45:18 crc kubenswrapper[4725]: I0225 11:45:18.224671 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:45:18 crc kubenswrapper[4725]: E0225 11:45:18.226001 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:45:21 crc kubenswrapper[4725]: I0225 11:45:21.559343 4725 scope.go:117] "RemoveContainer" containerID="1c2c73cb9a136828ce516615f2b242b82dbd4b8c6b7f647249caf14c33944f67" Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.502455 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hd65m"] Feb 25 11:45:28 crc kubenswrapper[4725]: E0225 11:45:28.503826 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bea8ee2-f811-4dce-a140-624bc53c8b34" containerName="collect-profiles" Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.503875 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bea8ee2-f811-4dce-a140-624bc53c8b34" containerName="collect-profiles" Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.504224 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bea8ee2-f811-4dce-a140-624bc53c8b34" containerName="collect-profiles" Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.506452 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.519021 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hd65m"]
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.639708 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-utilities\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.639860 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgvl\" (UniqueName: \"kubernetes.io/projected/b9e62ec5-9be6-457f-882b-2f2e3f03df54-kube-api-access-xlgvl\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.639930 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-catalog-content\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.741941 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-utilities\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.742053 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgvl\" (UniqueName: \"kubernetes.io/projected/b9e62ec5-9be6-457f-882b-2f2e3f03df54-kube-api-access-xlgvl\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.742094 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-catalog-content\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.742462 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-utilities\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.742528 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-catalog-content\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.763549 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgvl\" (UniqueName: \"kubernetes.io/projected/b9e62ec5-9be6-457f-882b-2f2e3f03df54-kube-api-access-xlgvl\") pod \"redhat-operators-hd65m\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") " pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:28 crc kubenswrapper[4725]: I0225 11:45:28.840942 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:29 crc kubenswrapper[4725]: I0225 11:45:29.308801 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hd65m"]
Feb 25 11:45:29 crc kubenswrapper[4725]: I0225 11:45:29.602595 4725 generic.go:334] "Generic (PLEG): container finished" podID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerID="3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146" exitCode=0
Feb 25 11:45:29 crc kubenswrapper[4725]: I0225 11:45:29.602675 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd65m" event={"ID":"b9e62ec5-9be6-457f-882b-2f2e3f03df54","Type":"ContainerDied","Data":"3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146"}
Feb 25 11:45:29 crc kubenswrapper[4725]: I0225 11:45:29.602949 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd65m" event={"ID":"b9e62ec5-9be6-457f-882b-2f2e3f03df54","Type":"ContainerStarted","Data":"15522ef9376a2027b291e288348990f8b70ea97e73b27ccfcc6f83176eeea762"}
Feb 25 11:45:29 crc kubenswrapper[4725]: I0225 11:45:29.604524 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 25 11:45:30 crc kubenswrapper[4725]: I0225 11:45:30.621326 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd65m" event={"ID":"b9e62ec5-9be6-457f-882b-2f2e3f03df54","Type":"ContainerStarted","Data":"3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775"}
Feb 25 11:45:31 crc kubenswrapper[4725]: I0225 11:45:31.635081 4725 generic.go:334] "Generic (PLEG): container finished" podID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerID="3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775" exitCode=0
Feb 25 11:45:31 crc kubenswrapper[4725]: I0225 11:45:31.635130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd65m" event={"ID":"b9e62ec5-9be6-457f-882b-2f2e3f03df54","Type":"ContainerDied","Data":"3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775"}
Feb 25 11:45:33 crc kubenswrapper[4725]: I0225 11:45:33.225438 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:45:33 crc kubenswrapper[4725]: E0225 11:45:33.226788 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:45:38 crc kubenswrapper[4725]: I0225 11:45:38.717311 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd65m" event={"ID":"b9e62ec5-9be6-457f-882b-2f2e3f03df54","Type":"ContainerStarted","Data":"21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d"}
Feb 25 11:45:38 crc kubenswrapper[4725]: I0225 11:45:38.737410 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hd65m" podStartSLOduration=2.377493574 podStartE2EDuration="10.737392242s" podCreationTimestamp="2026-02-25 11:45:28 +0000 UTC" firstStartedPulling="2026-02-25 11:45:29.604267728 +0000 UTC m=+3155.102849753" lastFinishedPulling="2026-02-25 11:45:37.964166386 +0000 UTC m=+3163.462748421" observedRunningTime="2026-02-25 11:45:38.732534193 +0000 UTC m=+3164.231116268" watchObservedRunningTime="2026-02-25 11:45:38.737392242 +0000 UTC m=+3164.235974277"
Feb 25 11:45:38 crc kubenswrapper[4725]: I0225 11:45:38.842014 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:38 crc kubenswrapper[4725]: I0225 11:45:38.842592 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:39 crc kubenswrapper[4725]: I0225 11:45:39.918345 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hd65m" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="registry-server" probeResult="failure" output=<
Feb 25 11:45:39 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s
Feb 25 11:45:39 crc kubenswrapper[4725]: >
Feb 25 11:45:46 crc kubenswrapper[4725]: I0225 11:45:46.224350 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:45:46 crc kubenswrapper[4725]: E0225 11:45:46.225457 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:45:48 crc kubenswrapper[4725]: I0225 11:45:48.891975 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:48 crc kubenswrapper[4725]: I0225 11:45:48.961627 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:49 crc kubenswrapper[4725]: I0225 11:45:49.141274 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hd65m"]
Feb 25 11:45:50 crc kubenswrapper[4725]: I0225 11:45:50.826420 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hd65m" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="registry-server" containerID="cri-o://21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d" gracePeriod=2
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.320785 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.396840 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-catalog-content\") pod \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") "
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.396969 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-utilities\") pod \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") "
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.397027 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgvl\" (UniqueName: \"kubernetes.io/projected/b9e62ec5-9be6-457f-882b-2f2e3f03df54-kube-api-access-xlgvl\") pod \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\" (UID: \"b9e62ec5-9be6-457f-882b-2f2e3f03df54\") "
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.398356 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-utilities" (OuterVolumeSpecName: "utilities") pod "b9e62ec5-9be6-457f-882b-2f2e3f03df54" (UID: "b9e62ec5-9be6-457f-882b-2f2e3f03df54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.403767 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e62ec5-9be6-457f-882b-2f2e3f03df54-kube-api-access-xlgvl" (OuterVolumeSpecName: "kube-api-access-xlgvl") pod "b9e62ec5-9be6-457f-882b-2f2e3f03df54" (UID: "b9e62ec5-9be6-457f-882b-2f2e3f03df54"). InnerVolumeSpecName "kube-api-access-xlgvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.500101 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.500140 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgvl\" (UniqueName: \"kubernetes.io/projected/b9e62ec5-9be6-457f-882b-2f2e3f03df54-kube-api-access-xlgvl\") on node \"crc\" DevicePath \"\""
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.534554 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9e62ec5-9be6-457f-882b-2f2e3f03df54" (UID: "b9e62ec5-9be6-457f-882b-2f2e3f03df54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.601305 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e62ec5-9be6-457f-882b-2f2e3f03df54-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.842238 4725 generic.go:334] "Generic (PLEG): container finished" podID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerID="21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d" exitCode=0
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.842301 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd65m" event={"ID":"b9e62ec5-9be6-457f-882b-2f2e3f03df54","Type":"ContainerDied","Data":"21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d"}
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.842331 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd65m" event={"ID":"b9e62ec5-9be6-457f-882b-2f2e3f03df54","Type":"ContainerDied","Data":"15522ef9376a2027b291e288348990f8b70ea97e73b27ccfcc6f83176eeea762"}
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.842356 4725 scope.go:117] "RemoveContainer" containerID="21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.842363 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd65m"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.890069 4725 scope.go:117] "RemoveContainer" containerID="3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.902670 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hd65m"]
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.916337 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hd65m"]
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.931745 4725 scope.go:117] "RemoveContainer" containerID="3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.991205 4725 scope.go:117] "RemoveContainer" containerID="21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d"
Feb 25 11:45:51 crc kubenswrapper[4725]: E0225 11:45:51.992054 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d\": container with ID starting with 21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d not found: ID does not exist" containerID="21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.992119 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d"} err="failed to get container status \"21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d\": rpc error: code = NotFound desc = could not find container \"21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d\": container with ID starting with 21c5bac52d90b787f8c67a7890d465a8828d88ca4dbb60f60bd5f068bd60834d not found: ID does not exist"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.992164 4725 scope.go:117] "RemoveContainer" containerID="3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775"
Feb 25 11:45:51 crc kubenswrapper[4725]: E0225 11:45:51.993618 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775\": container with ID starting with 3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775 not found: ID does not exist" containerID="3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.993669 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775"} err="failed to get container status \"3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775\": rpc error: code = NotFound desc = could not find container \"3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775\": container with ID starting with 3eb161cb6c7d2bed1328c34e17a1e94178a18cab8a9024611f6af4edb7aa5775 not found: ID does not exist"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.993701 4725 scope.go:117] "RemoveContainer" containerID="3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146"
Feb 25 11:45:51 crc kubenswrapper[4725]: E0225 11:45:51.995374 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146\": container with ID starting with 3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146 not found: ID does not exist" containerID="3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146"
Feb 25 11:45:51 crc kubenswrapper[4725]: I0225 11:45:51.995424 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146"} err="failed to get container status \"3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146\": rpc error: code = NotFound desc = could not find container \"3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146\": container with ID starting with 3c90f4ac6762970a7baaa0ab540fb7db94b61bb5985b702e8baa3b4506c38146 not found: ID does not exist"
Feb 25 11:45:53 crc kubenswrapper[4725]: I0225 11:45:53.263850 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" path="/var/lib/kubelet/pods/b9e62ec5-9be6-457f-882b-2f2e3f03df54/volumes"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.166546 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533666-fs596"]
Feb 25 11:46:00 crc kubenswrapper[4725]: E0225 11:46:00.168045 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="registry-server"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.168074 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="registry-server"
Feb 25 11:46:00 crc kubenswrapper[4725]: E0225 11:46:00.168110 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="extract-content"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.168123 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="extract-content"
Feb 25 11:46:00 crc kubenswrapper[4725]: E0225 11:46:00.168151 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="extract-utilities"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.168163 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="extract-utilities"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.168490 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e62ec5-9be6-457f-882b-2f2e3f03df54" containerName="registry-server"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.169674 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533666-fs596"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.174988 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.175052 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.175606 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.180507 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533666-fs596"]
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.224436 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:46:00 crc kubenswrapper[4725]: E0225 11:46:00.224949 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.311234 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kp8j\" (UniqueName: \"kubernetes.io/projected/5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2-kube-api-access-5kp8j\") pod \"auto-csr-approver-29533666-fs596\" (UID: \"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2\") " pod="openshift-infra/auto-csr-approver-29533666-fs596"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.415592 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kp8j\" (UniqueName: \"kubernetes.io/projected/5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2-kube-api-access-5kp8j\") pod \"auto-csr-approver-29533666-fs596\" (UID: \"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2\") " pod="openshift-infra/auto-csr-approver-29533666-fs596"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.451720 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kp8j\" (UniqueName: \"kubernetes.io/projected/5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2-kube-api-access-5kp8j\") pod \"auto-csr-approver-29533666-fs596\" (UID: \"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2\") " pod="openshift-infra/auto-csr-approver-29533666-fs596"
Feb 25 11:46:00 crc kubenswrapper[4725]: I0225 11:46:00.523592 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533666-fs596"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:00.998675 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533666-fs596"]
Feb 25 11:46:01 crc kubenswrapper[4725]: W0225 11:46:01.010230 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff9de35_0fa6_4ecf_93f0_0d24a39cabe2.slice/crio-ca9f2e1ba05efebc9e793908be081a07da936c73692bed25c77e156330c98c86 WatchSource:0}: Error finding container ca9f2e1ba05efebc9e793908be081a07da936c73692bed25c77e156330c98c86: Status 404 returned error can't find the container with id ca9f2e1ba05efebc9e793908be081a07da936c73692bed25c77e156330c98c86
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.495947 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b65dm"]
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.498305 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.510554 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b65dm"]
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.636045 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j2ck\" (UniqueName: \"kubernetes.io/projected/0b034a3e-358e-4ee5-b771-1c72919d4831-kube-api-access-9j2ck\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.636090 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-utilities\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.636314 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-catalog-content\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.738988 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j2ck\" (UniqueName: \"kubernetes.io/projected/0b034a3e-358e-4ee5-b771-1c72919d4831-kube-api-access-9j2ck\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.739038 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-utilities\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.739113 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-catalog-content\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.739574 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-utilities\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.739600 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-catalog-content\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.770181 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j2ck\" (UniqueName: \"kubernetes.io/projected/0b034a3e-358e-4ee5-b771-1c72919d4831-kube-api-access-9j2ck\") pod \"certified-operators-b65dm\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.832414 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:01 crc kubenswrapper[4725]: I0225 11:46:01.956409 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533666-fs596" event={"ID":"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2","Type":"ContainerStarted","Data":"ca9f2e1ba05efebc9e793908be081a07da936c73692bed25c77e156330c98c86"}
Feb 25 11:46:02 crc kubenswrapper[4725]: I0225 11:46:02.343343 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b65dm"]
Feb 25 11:46:02 crc kubenswrapper[4725]: I0225 11:46:02.969355 4725 generic.go:334] "Generic (PLEG): container finished" podID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerID="12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af" exitCode=0
Feb 25 11:46:02 crc kubenswrapper[4725]: I0225 11:46:02.969583 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b65dm" event={"ID":"0b034a3e-358e-4ee5-b771-1c72919d4831","Type":"ContainerDied","Data":"12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af"}
Feb 25 11:46:02 crc kubenswrapper[4725]: I0225 11:46:02.970035 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b65dm" event={"ID":"0b034a3e-358e-4ee5-b771-1c72919d4831","Type":"ContainerStarted","Data":"0841d74d8d46992dc3197b46efae3afb882e25b49e0daa75a27bf5f0acc57fba"}
Feb 25 11:46:03 crc kubenswrapper[4725]: I0225 11:46:03.987214 4725 generic.go:334] "Generic (PLEG): container finished" podID="5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2" containerID="28ae3e7458ed12bd309f567175023ec4417e9b8f6574d6f2eada4845763b83e3" exitCode=0
Feb 25 11:46:03 crc kubenswrapper[4725]: I0225 11:46:03.987408 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533666-fs596" event={"ID":"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2","Type":"ContainerDied","Data":"28ae3e7458ed12bd309f567175023ec4417e9b8f6574d6f2eada4845763b83e3"}
Feb 25 11:46:05 crc kubenswrapper[4725]: I0225 11:46:04.999654 4725 generic.go:334] "Generic (PLEG): container finished" podID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerID="4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de" exitCode=0
Feb 25 11:46:05 crc kubenswrapper[4725]: I0225 11:46:04.999719 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b65dm" event={"ID":"0b034a3e-358e-4ee5-b771-1c72919d4831","Type":"ContainerDied","Data":"4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de"}
Feb 25 11:46:05 crc kubenswrapper[4725]: I0225 11:46:05.428990 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533666-fs596"
Feb 25 11:46:05 crc kubenswrapper[4725]: I0225 11:46:05.617851 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kp8j\" (UniqueName: \"kubernetes.io/projected/5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2-kube-api-access-5kp8j\") pod \"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2\" (UID: \"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2\") "
Feb 25 11:46:05 crc kubenswrapper[4725]: I0225 11:46:05.627653 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2-kube-api-access-5kp8j" (OuterVolumeSpecName: "kube-api-access-5kp8j") pod "5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2" (UID: "5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2"). InnerVolumeSpecName "kube-api-access-5kp8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:46:05 crc kubenswrapper[4725]: I0225 11:46:05.720500 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kp8j\" (UniqueName: \"kubernetes.io/projected/5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2-kube-api-access-5kp8j\") on node \"crc\" DevicePath \"\""
Feb 25 11:46:06 crc kubenswrapper[4725]: I0225 11:46:06.009754 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533666-fs596" event={"ID":"5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2","Type":"ContainerDied","Data":"ca9f2e1ba05efebc9e793908be081a07da936c73692bed25c77e156330c98c86"}
Feb 25 11:46:06 crc kubenswrapper[4725]: I0225 11:46:06.009788 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9f2e1ba05efebc9e793908be081a07da936c73692bed25c77e156330c98c86"
Feb 25 11:46:06 crc kubenswrapper[4725]: I0225 11:46:06.009852 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533666-fs596"
Feb 25 11:46:06 crc kubenswrapper[4725]: I0225 11:46:06.018372 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b65dm" event={"ID":"0b034a3e-358e-4ee5-b771-1c72919d4831","Type":"ContainerStarted","Data":"d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38"}
Feb 25 11:46:06 crc kubenswrapper[4725]: I0225 11:46:06.045225 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b65dm" podStartSLOduration=2.415587987 podStartE2EDuration="5.045206664s" podCreationTimestamp="2026-02-25 11:46:01 +0000 UTC" firstStartedPulling="2026-02-25 11:46:02.972013861 +0000 UTC m=+3188.470595896" lastFinishedPulling="2026-02-25 11:46:05.601632548 +0000 UTC m=+3191.100214573" observedRunningTime="2026-02-25 11:46:06.037871091 +0000 UTC m=+3191.536453116" watchObservedRunningTime="2026-02-25 11:46:06.045206664 +0000 UTC m=+3191.543788689"
Feb 25 11:46:06 crc kubenswrapper[4725]: I0225 11:46:06.497164 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533660-2kc5z"]
Feb 25 11:46:06 crc kubenswrapper[4725]: I0225 11:46:06.506119 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533660-2kc5z"]
Feb 25 11:46:07 crc kubenswrapper[4725]: I0225 11:46:07.241438 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37abe97b-f606-4716-8dd7-957b23922f42" path="/var/lib/kubelet/pods/37abe97b-f606-4716-8dd7-957b23922f42/volumes"
Feb 25 11:46:11 crc kubenswrapper[4725]: I0225 11:46:11.833337 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:11 crc kubenswrapper[4725]: I0225 11:46:11.833910 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:11 crc kubenswrapper[4725]: I0225 11:46:11.890256 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:12 crc kubenswrapper[4725]: I0225 11:46:12.139599 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b65dm"
Feb 25 11:46:12 crc kubenswrapper[4725]: I0225 11:46:12.190406 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b65dm"]
Feb 25 11:46:12 crc kubenswrapper[4725]: I0225 11:46:12.224774 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:46:12 crc kubenswrapper[4725]: E0225 11:46:12.225084 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.098275 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b65dm" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="registry-server" containerID="cri-o://d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38" gracePeriod=2 Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.612630 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b65dm" Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.803020 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-utilities\") pod \"0b034a3e-358e-4ee5-b771-1c72919d4831\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.803504 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-catalog-content\") pod \"0b034a3e-358e-4ee5-b771-1c72919d4831\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.804245 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-utilities" (OuterVolumeSpecName: "utilities") pod "0b034a3e-358e-4ee5-b771-1c72919d4831" (UID: "0b034a3e-358e-4ee5-b771-1c72919d4831"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.806070 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j2ck\" (UniqueName: \"kubernetes.io/projected/0b034a3e-358e-4ee5-b771-1c72919d4831-kube-api-access-9j2ck\") pod \"0b034a3e-358e-4ee5-b771-1c72919d4831\" (UID: \"0b034a3e-358e-4ee5-b771-1c72919d4831\") " Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.806846 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.811708 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b034a3e-358e-4ee5-b771-1c72919d4831-kube-api-access-9j2ck" (OuterVolumeSpecName: "kube-api-access-9j2ck") pod "0b034a3e-358e-4ee5-b771-1c72919d4831" (UID: "0b034a3e-358e-4ee5-b771-1c72919d4831"). InnerVolumeSpecName "kube-api-access-9j2ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.859697 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b034a3e-358e-4ee5-b771-1c72919d4831" (UID: "0b034a3e-358e-4ee5-b771-1c72919d4831"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.907506 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j2ck\" (UniqueName: \"kubernetes.io/projected/0b034a3e-358e-4ee5-b771-1c72919d4831-kube-api-access-9j2ck\") on node \"crc\" DevicePath \"\"" Feb 25 11:46:14 crc kubenswrapper[4725]: I0225 11:46:14.907535 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b034a3e-358e-4ee5-b771-1c72919d4831-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.110884 4725 generic.go:334] "Generic (PLEG): container finished" podID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerID="d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38" exitCode=0 Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.110936 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b65dm" event={"ID":"0b034a3e-358e-4ee5-b771-1c72919d4831","Type":"ContainerDied","Data":"d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38"} Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.110975 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b65dm" event={"ID":"0b034a3e-358e-4ee5-b771-1c72919d4831","Type":"ContainerDied","Data":"0841d74d8d46992dc3197b46efae3afb882e25b49e0daa75a27bf5f0acc57fba"} Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.111002 4725 scope.go:117] "RemoveContainer" containerID="d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.111867 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b65dm" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.130541 4725 scope.go:117] "RemoveContainer" containerID="4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.152641 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b65dm"] Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.163360 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b65dm"] Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.169706 4725 scope.go:117] "RemoveContainer" containerID="12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.202187 4725 scope.go:117] "RemoveContainer" containerID="d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38" Feb 25 11:46:15 crc kubenswrapper[4725]: E0225 11:46:15.202788 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38\": container with ID starting with d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38 not found: ID does not exist" containerID="d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.202863 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38"} err="failed to get container status \"d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38\": rpc error: code = NotFound desc = could not find container \"d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38\": container with ID starting with d28f0eef1acb5540fbad48e8a5bee5cd73f0a1da58e8c29d73644e84dafd9e38 not 
found: ID does not exist" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.202890 4725 scope.go:117] "RemoveContainer" containerID="4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de" Feb 25 11:46:15 crc kubenswrapper[4725]: E0225 11:46:15.203402 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de\": container with ID starting with 4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de not found: ID does not exist" containerID="4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.203453 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de"} err="failed to get container status \"4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de\": rpc error: code = NotFound desc = could not find container \"4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de\": container with ID starting with 4c2ece6f4eb5c08434b691a1bb72fdcf43611ceee98e5f296541c96cc27ad1de not found: ID does not exist" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.203483 4725 scope.go:117] "RemoveContainer" containerID="12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af" Feb 25 11:46:15 crc kubenswrapper[4725]: E0225 11:46:15.203905 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af\": container with ID starting with 12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af not found: ID does not exist" containerID="12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.203953 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af"} err="failed to get container status \"12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af\": rpc error: code = NotFound desc = could not find container \"12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af\": container with ID starting with 12dc7db7201cc9fdea739f0bebc895ed1a5d1e969b421787ba2aee60bef614af not found: ID does not exist" Feb 25 11:46:15 crc kubenswrapper[4725]: I0225 11:46:15.236447 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" path="/var/lib/kubelet/pods/0b034a3e-358e-4ee5-b771-1c72919d4831/volumes" Feb 25 11:46:21 crc kubenswrapper[4725]: I0225 11:46:21.668277 4725 scope.go:117] "RemoveContainer" containerID="449ae2c78d7d9e832975ddc4432326b75d9aae940519a52a8b6e23236fac8c88" Feb 25 11:46:26 crc kubenswrapper[4725]: I0225 11:46:26.225054 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:46:26 crc kubenswrapper[4725]: E0225 11:46:26.226224 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:46:40 crc kubenswrapper[4725]: I0225 11:46:40.224247 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:46:40 crc kubenswrapper[4725]: E0225 11:46:40.225097 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:46:55 crc kubenswrapper[4725]: I0225 11:46:55.241753 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:46:55 crc kubenswrapper[4725]: E0225 11:46:55.245101 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:47:10 crc kubenswrapper[4725]: I0225 11:47:10.225201 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:47:10 crc kubenswrapper[4725]: E0225 11:47:10.225852 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:47:24 crc kubenswrapper[4725]: I0225 11:47:24.224109 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:47:24 crc kubenswrapper[4725]: E0225 11:47:24.226935 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:47:39 crc kubenswrapper[4725]: I0225 11:47:39.225372 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:47:39 crc kubenswrapper[4725]: E0225 11:47:39.226210 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:47:51 crc kubenswrapper[4725]: I0225 11:47:51.224676 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:47:51 crc kubenswrapper[4725]: E0225 11:47:51.225507 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.233377 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533668-q9582"] Feb 25 11:48:00 crc kubenswrapper[4725]: E0225 11:48:00.235099 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="extract-content" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.235137 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="extract-content" Feb 25 11:48:00 crc kubenswrapper[4725]: E0225 11:48:00.235178 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="extract-utilities" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.235196 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="extract-utilities" Feb 25 11:48:00 crc kubenswrapper[4725]: E0225 11:48:00.235234 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2" containerName="oc" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.235251 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2" containerName="oc" Feb 25 11:48:00 crc kubenswrapper[4725]: E0225 11:48:00.235277 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="registry-server" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.235294 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="registry-server" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.235757 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b034a3e-358e-4ee5-b771-1c72919d4831" containerName="registry-server" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.235806 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2" containerName="oc" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.237291 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533668-q9582" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.242984 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.243196 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.244263 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.248801 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533668-q9582"] Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.363308 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7dbw\" (UniqueName: \"kubernetes.io/projected/9f49a014-2668-4dc4-bf75-c7c41a71c209-kube-api-access-r7dbw\") pod \"auto-csr-approver-29533668-q9582\" (UID: \"9f49a014-2668-4dc4-bf75-c7c41a71c209\") " pod="openshift-infra/auto-csr-approver-29533668-q9582" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.465232 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7dbw\" (UniqueName: \"kubernetes.io/projected/9f49a014-2668-4dc4-bf75-c7c41a71c209-kube-api-access-r7dbw\") pod \"auto-csr-approver-29533668-q9582\" (UID: \"9f49a014-2668-4dc4-bf75-c7c41a71c209\") " pod="openshift-infra/auto-csr-approver-29533668-q9582" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.492607 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7dbw\" (UniqueName: \"kubernetes.io/projected/9f49a014-2668-4dc4-bf75-c7c41a71c209-kube-api-access-r7dbw\") pod \"auto-csr-approver-29533668-q9582\" (UID: \"9f49a014-2668-4dc4-bf75-c7c41a71c209\") " 
pod="openshift-infra/auto-csr-approver-29533668-q9582" Feb 25 11:48:00 crc kubenswrapper[4725]: I0225 11:48:00.589568 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533668-q9582" Feb 25 11:48:01 crc kubenswrapper[4725]: I0225 11:48:01.077900 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533668-q9582"] Feb 25 11:48:01 crc kubenswrapper[4725]: I0225 11:48:01.162137 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533668-q9582" event={"ID":"9f49a014-2668-4dc4-bf75-c7c41a71c209","Type":"ContainerStarted","Data":"37f660c0daa1de7d24158cbd6663d4b40039b9f2f5613541229074b549272f93"} Feb 25 11:48:02 crc kubenswrapper[4725]: I0225 11:48:02.225045 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:48:02 crc kubenswrapper[4725]: E0225 11:48:02.225594 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:48:03 crc kubenswrapper[4725]: I0225 11:48:03.192757 4725 generic.go:334] "Generic (PLEG): container finished" podID="9f49a014-2668-4dc4-bf75-c7c41a71c209" containerID="579087ef6d0d0dadc41f59207ebb8127e2674b95834ca36ce5ba7ce3b109a51d" exitCode=0 Feb 25 11:48:03 crc kubenswrapper[4725]: I0225 11:48:03.193053 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533668-q9582" event={"ID":"9f49a014-2668-4dc4-bf75-c7c41a71c209","Type":"ContainerDied","Data":"579087ef6d0d0dadc41f59207ebb8127e2674b95834ca36ce5ba7ce3b109a51d"} 
Feb 25 11:48:04 crc kubenswrapper[4725]: I0225 11:48:04.627760 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533668-q9582" Feb 25 11:48:04 crc kubenswrapper[4725]: I0225 11:48:04.749286 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7dbw\" (UniqueName: \"kubernetes.io/projected/9f49a014-2668-4dc4-bf75-c7c41a71c209-kube-api-access-r7dbw\") pod \"9f49a014-2668-4dc4-bf75-c7c41a71c209\" (UID: \"9f49a014-2668-4dc4-bf75-c7c41a71c209\") " Feb 25 11:48:04 crc kubenswrapper[4725]: I0225 11:48:04.760098 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f49a014-2668-4dc4-bf75-c7c41a71c209-kube-api-access-r7dbw" (OuterVolumeSpecName: "kube-api-access-r7dbw") pod "9f49a014-2668-4dc4-bf75-c7c41a71c209" (UID: "9f49a014-2668-4dc4-bf75-c7c41a71c209"). InnerVolumeSpecName "kube-api-access-r7dbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:48:04 crc kubenswrapper[4725]: I0225 11:48:04.852167 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7dbw\" (UniqueName: \"kubernetes.io/projected/9f49a014-2668-4dc4-bf75-c7c41a71c209-kube-api-access-r7dbw\") on node \"crc\" DevicePath \"\"" Feb 25 11:48:05 crc kubenswrapper[4725]: I0225 11:48:05.217459 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533668-q9582" event={"ID":"9f49a014-2668-4dc4-bf75-c7c41a71c209","Type":"ContainerDied","Data":"37f660c0daa1de7d24158cbd6663d4b40039b9f2f5613541229074b549272f93"} Feb 25 11:48:05 crc kubenswrapper[4725]: I0225 11:48:05.217504 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37f660c0daa1de7d24158cbd6663d4b40039b9f2f5613541229074b549272f93" Feb 25 11:48:05 crc kubenswrapper[4725]: I0225 11:48:05.218050 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533668-q9582" Feb 25 11:48:05 crc kubenswrapper[4725]: I0225 11:48:05.705625 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533662-s664f"] Feb 25 11:48:05 crc kubenswrapper[4725]: I0225 11:48:05.718258 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533662-s664f"] Feb 25 11:48:07 crc kubenswrapper[4725]: I0225 11:48:07.238962 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91babeeb-e052-4777-bae5-6e19023d92df" path="/var/lib/kubelet/pods/91babeeb-e052-4777-bae5-6e19023d92df/volumes" Feb 25 11:48:15 crc kubenswrapper[4725]: I0225 11:48:15.232559 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:48:15 crc kubenswrapper[4725]: E0225 11:48:15.233514 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:48:21 crc kubenswrapper[4725]: I0225 11:48:21.793844 4725 scope.go:117] "RemoveContainer" containerID="441aa5a172dbc380697f94f1ed92f72a71085d2ee92200935117e576f6b931e6" Feb 25 11:48:29 crc kubenswrapper[4725]: I0225 11:48:29.224567 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797" Feb 25 11:48:29 crc kubenswrapper[4725]: I0225 11:48:29.486150 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"55972f279b171bec5e6d0dee8be26569a49cd30b83e5c71721b156cab7b1e025"} Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.141666 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533670-klrw7"] Feb 25 11:50:00 crc kubenswrapper[4725]: E0225 11:50:00.143385 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f49a014-2668-4dc4-bf75-c7c41a71c209" containerName="oc" Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.143417 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f49a014-2668-4dc4-bf75-c7c41a71c209" containerName="oc" Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.143919 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f49a014-2668-4dc4-bf75-c7c41a71c209" containerName="oc" Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.145284 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533670-klrw7" Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.148183 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.149366 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.149974 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.153681 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533670-klrw7"] Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.227192 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvgxq\" (UniqueName: 
\"kubernetes.io/projected/c4d6f332-ef19-43b5-8fb7-566663952b6c-kube-api-access-kvgxq\") pod \"auto-csr-approver-29533670-klrw7\" (UID: \"c4d6f332-ef19-43b5-8fb7-566663952b6c\") " pod="openshift-infra/auto-csr-approver-29533670-klrw7"
Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.329347 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvgxq\" (UniqueName: \"kubernetes.io/projected/c4d6f332-ef19-43b5-8fb7-566663952b6c-kube-api-access-kvgxq\") pod \"auto-csr-approver-29533670-klrw7\" (UID: \"c4d6f332-ef19-43b5-8fb7-566663952b6c\") " pod="openshift-infra/auto-csr-approver-29533670-klrw7"
Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.346225 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvgxq\" (UniqueName: \"kubernetes.io/projected/c4d6f332-ef19-43b5-8fb7-566663952b6c-kube-api-access-kvgxq\") pod \"auto-csr-approver-29533670-klrw7\" (UID: \"c4d6f332-ef19-43b5-8fb7-566663952b6c\") " pod="openshift-infra/auto-csr-approver-29533670-klrw7"
Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.476278 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533670-klrw7"
Feb 25 11:50:00 crc kubenswrapper[4725]: I0225 11:50:00.927623 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533670-klrw7"]
Feb 25 11:50:01 crc kubenswrapper[4725]: I0225 11:50:01.409071 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533670-klrw7" event={"ID":"c4d6f332-ef19-43b5-8fb7-566663952b6c","Type":"ContainerStarted","Data":"1a160ee68c5febf3c0341ab53f37602d8423de11ab73b58298cad805d4cca3ae"}
Feb 25 11:50:02 crc kubenswrapper[4725]: I0225 11:50:02.434330 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533670-klrw7" event={"ID":"c4d6f332-ef19-43b5-8fb7-566663952b6c","Type":"ContainerStarted","Data":"fde251ff53c95f8cda1be1c9280fad856e3207095c23855ce8674dc5c2b711ba"}
Feb 25 11:50:02 crc kubenswrapper[4725]: I0225 11:50:02.459218 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533670-klrw7" podStartSLOduration=1.32434082 podStartE2EDuration="2.459194979s" podCreationTimestamp="2026-02-25 11:50:00 +0000 UTC" firstStartedPulling="2026-02-25 11:50:00.930508999 +0000 UTC m=+3426.429091044" lastFinishedPulling="2026-02-25 11:50:02.065363178 +0000 UTC m=+3427.563945203" observedRunningTime="2026-02-25 11:50:02.449567156 +0000 UTC m=+3427.948149181" watchObservedRunningTime="2026-02-25 11:50:02.459194979 +0000 UTC m=+3427.957777004"
Feb 25 11:50:03 crc kubenswrapper[4725]: I0225 11:50:03.447265 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4d6f332-ef19-43b5-8fb7-566663952b6c" containerID="fde251ff53c95f8cda1be1c9280fad856e3207095c23855ce8674dc5c2b711ba" exitCode=0
Feb 25 11:50:03 crc kubenswrapper[4725]: I0225 11:50:03.447333 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533670-klrw7" event={"ID":"c4d6f332-ef19-43b5-8fb7-566663952b6c","Type":"ContainerDied","Data":"fde251ff53c95f8cda1be1c9280fad856e3207095c23855ce8674dc5c2b711ba"}
Feb 25 11:50:04 crc kubenswrapper[4725]: I0225 11:50:04.831906 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533670-klrw7"
Feb 25 11:50:04 crc kubenswrapper[4725]: I0225 11:50:04.957010 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvgxq\" (UniqueName: \"kubernetes.io/projected/c4d6f332-ef19-43b5-8fb7-566663952b6c-kube-api-access-kvgxq\") pod \"c4d6f332-ef19-43b5-8fb7-566663952b6c\" (UID: \"c4d6f332-ef19-43b5-8fb7-566663952b6c\") "
Feb 25 11:50:04 crc kubenswrapper[4725]: I0225 11:50:04.964014 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d6f332-ef19-43b5-8fb7-566663952b6c-kube-api-access-kvgxq" (OuterVolumeSpecName: "kube-api-access-kvgxq") pod "c4d6f332-ef19-43b5-8fb7-566663952b6c" (UID: "c4d6f332-ef19-43b5-8fb7-566663952b6c"). InnerVolumeSpecName "kube-api-access-kvgxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:50:05 crc kubenswrapper[4725]: I0225 11:50:05.059722 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvgxq\" (UniqueName: \"kubernetes.io/projected/c4d6f332-ef19-43b5-8fb7-566663952b6c-kube-api-access-kvgxq\") on node \"crc\" DevicePath \"\""
Feb 25 11:50:05 crc kubenswrapper[4725]: I0225 11:50:05.468992 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533670-klrw7" event={"ID":"c4d6f332-ef19-43b5-8fb7-566663952b6c","Type":"ContainerDied","Data":"1a160ee68c5febf3c0341ab53f37602d8423de11ab73b58298cad805d4cca3ae"}
Feb 25 11:50:05 crc kubenswrapper[4725]: I0225 11:50:05.469048 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a160ee68c5febf3c0341ab53f37602d8423de11ab73b58298cad805d4cca3ae"
Feb 25 11:50:05 crc kubenswrapper[4725]: I0225 11:50:05.469071 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533670-klrw7"
Feb 25 11:50:05 crc kubenswrapper[4725]: I0225 11:50:05.526725 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533664-4l5bj"]
Feb 25 11:50:05 crc kubenswrapper[4725]: I0225 11:50:05.534766 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533664-4l5bj"]
Feb 25 11:50:07 crc kubenswrapper[4725]: I0225 11:50:07.240018 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31181682-fa6c-417c-a2ba-08842aeca089" path="/var/lib/kubelet/pods/31181682-fa6c-417c-a2ba-08842aeca089/volumes"
Feb 25 11:50:21 crc kubenswrapper[4725]: I0225 11:50:21.887621 4725 scope.go:117] "RemoveContainer" containerID="8d8ed389819106127cc3bb70ddd24ce9497d1a930b889c31e1f14844b5aeee02"
Feb 25 11:50:41 crc kubenswrapper[4725]: I0225 11:50:41.555399 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:50:41 crc kubenswrapper[4725]: I0225 11:50:41.556160 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.709530 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v9xtd"]
Feb 25 11:50:43 crc kubenswrapper[4725]: E0225 11:50:43.710396 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d6f332-ef19-43b5-8fb7-566663952b6c" containerName="oc"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.710435 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d6f332-ef19-43b5-8fb7-566663952b6c" containerName="oc"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.710697 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d6f332-ef19-43b5-8fb7-566663952b6c" containerName="oc"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.712406 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.720168 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v9xtd"]
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.836512 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkr5x\" (UniqueName: \"kubernetes.io/projected/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-kube-api-access-dkr5x\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.836813 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-catalog-content\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.836988 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-utilities\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.938211 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-utilities\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.938294 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkr5x\" (UniqueName: \"kubernetes.io/projected/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-kube-api-access-dkr5x\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.938375 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-catalog-content\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.939116 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-utilities\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.939147 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-catalog-content\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:43 crc kubenswrapper[4725]: I0225 11:50:43.964744 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkr5x\" (UniqueName: \"kubernetes.io/projected/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-kube-api-access-dkr5x\") pod \"community-operators-v9xtd\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") " pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:44 crc kubenswrapper[4725]: I0225 11:50:44.033239 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:44 crc kubenswrapper[4725]: I0225 11:50:44.610400 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v9xtd"]
Feb 25 11:50:44 crc kubenswrapper[4725]: I0225 11:50:44.845164 4725 generic.go:334] "Generic (PLEG): container finished" podID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerID="4a22099ff549ac1edb2b585a67b2e388e54f4c17faa438475b4f39645c2df7db" exitCode=0
Feb 25 11:50:44 crc kubenswrapper[4725]: I0225 11:50:44.845245 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9xtd" event={"ID":"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d","Type":"ContainerDied","Data":"4a22099ff549ac1edb2b585a67b2e388e54f4c17faa438475b4f39645c2df7db"}
Feb 25 11:50:44 crc kubenswrapper[4725]: I0225 11:50:44.845310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9xtd" event={"ID":"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d","Type":"ContainerStarted","Data":"3027650d80655ce5fe6b0b6c70304b87ca54ebd468c60f1d11293f1ceb2c8e4a"}
Feb 25 11:50:44 crc kubenswrapper[4725]: I0225 11:50:44.848028 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 25 11:50:46 crc kubenswrapper[4725]: I0225 11:50:46.865416 4725 generic.go:334] "Generic (PLEG): container finished" podID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerID="76b15b5d8f1b27e122ecbea2e07f552dc012cf8db55e4294cd4a7414a6675c77" exitCode=0
Feb 25 11:50:46 crc kubenswrapper[4725]: I0225 11:50:46.865495 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9xtd" event={"ID":"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d","Type":"ContainerDied","Data":"76b15b5d8f1b27e122ecbea2e07f552dc012cf8db55e4294cd4a7414a6675c77"}
Feb 25 11:50:47 crc kubenswrapper[4725]: I0225 11:50:47.878223 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9xtd" event={"ID":"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d","Type":"ContainerStarted","Data":"56526f9981eb625dd6e8de39a5940864212b93e636d1c8c94b002f6451486912"}
Feb 25 11:50:47 crc kubenswrapper[4725]: I0225 11:50:47.897671 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v9xtd" podStartSLOduration=2.4836019990000002 podStartE2EDuration="4.897651491s" podCreationTimestamp="2026-02-25 11:50:43 +0000 UTC" firstStartedPulling="2026-02-25 11:50:44.847610896 +0000 UTC m=+3470.346192921" lastFinishedPulling="2026-02-25 11:50:47.261660378 +0000 UTC m=+3472.760242413" observedRunningTime="2026-02-25 11:50:47.896071939 +0000 UTC m=+3473.394653994" watchObservedRunningTime="2026-02-25 11:50:47.897651491 +0000 UTC m=+3473.396233516"
Feb 25 11:50:54 crc kubenswrapper[4725]: I0225 11:50:54.034017 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:54 crc kubenswrapper[4725]: I0225 11:50:54.056033 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:54 crc kubenswrapper[4725]: I0225 11:50:54.110267 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:55 crc kubenswrapper[4725]: I0225 11:50:55.118359 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:55 crc kubenswrapper[4725]: I0225 11:50:55.174227 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v9xtd"]
Feb 25 11:50:57 crc kubenswrapper[4725]: I0225 11:50:57.069797 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v9xtd" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="registry-server" containerID="cri-o://56526f9981eb625dd6e8de39a5940864212b93e636d1c8c94b002f6451486912" gracePeriod=2
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.087970 4725 generic.go:334] "Generic (PLEG): container finished" podID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerID="56526f9981eb625dd6e8de39a5940864212b93e636d1c8c94b002f6451486912" exitCode=0
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.088035 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9xtd" event={"ID":"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d","Type":"ContainerDied","Data":"56526f9981eb625dd6e8de39a5940864212b93e636d1c8c94b002f6451486912"}
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.398657 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.532188 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-utilities\") pod \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") "
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.532275 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkr5x\" (UniqueName: \"kubernetes.io/projected/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-kube-api-access-dkr5x\") pod \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") "
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.532332 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-catalog-content\") pod \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\" (UID: \"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d\") "
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.533378 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-utilities" (OuterVolumeSpecName: "utilities") pod "3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" (UID: "3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.538473 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-kube-api-access-dkr5x" (OuterVolumeSpecName: "kube-api-access-dkr5x") pod "3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" (UID: "3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d"). InnerVolumeSpecName "kube-api-access-dkr5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.584122 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" (UID: "3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.636151 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.636196 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkr5x\" (UniqueName: \"kubernetes.io/projected/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-kube-api-access-dkr5x\") on node \"crc\" DevicePath \"\""
Feb 25 11:50:59 crc kubenswrapper[4725]: I0225 11:50:59.636208 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 11:51:00 crc kubenswrapper[4725]: I0225 11:51:00.099313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9xtd" event={"ID":"3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d","Type":"ContainerDied","Data":"3027650d80655ce5fe6b0b6c70304b87ca54ebd468c60f1d11293f1ceb2c8e4a"}
Feb 25 11:51:00 crc kubenswrapper[4725]: I0225 11:51:00.099368 4725 scope.go:117] "RemoveContainer" containerID="56526f9981eb625dd6e8de39a5940864212b93e636d1c8c94b002f6451486912"
Feb 25 11:51:00 crc kubenswrapper[4725]: I0225 11:51:00.099406 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9xtd"
Feb 25 11:51:00 crc kubenswrapper[4725]: I0225 11:51:00.123178 4725 scope.go:117] "RemoveContainer" containerID="76b15b5d8f1b27e122ecbea2e07f552dc012cf8db55e4294cd4a7414a6675c77"
Feb 25 11:51:00 crc kubenswrapper[4725]: I0225 11:51:00.139001 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v9xtd"]
Feb 25 11:51:00 crc kubenswrapper[4725]: I0225 11:51:00.149197 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v9xtd"]
Feb 25 11:51:00 crc kubenswrapper[4725]: I0225 11:51:00.174200 4725 scope.go:117] "RemoveContainer" containerID="4a22099ff549ac1edb2b585a67b2e388e54f4c17faa438475b4f39645c2df7db"
Feb 25 11:51:01 crc kubenswrapper[4725]: I0225 11:51:01.237526 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" path="/var/lib/kubelet/pods/3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d/volumes"
Feb 25 11:51:11 crc kubenswrapper[4725]: I0225 11:51:11.556130 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:51:11 crc kubenswrapper[4725]: I0225 11:51:11.556684 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:51:41 crc kubenswrapper[4725]: I0225 11:51:41.555937 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 11:51:41 crc kubenswrapper[4725]: I0225 11:51:41.556572 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 11:51:41 crc kubenswrapper[4725]: I0225 11:51:41.556619 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf"
Feb 25 11:51:41 crc kubenswrapper[4725]: I0225 11:51:41.557351 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55972f279b171bec5e6d0dee8be26569a49cd30b83e5c71721b156cab7b1e025"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 11:51:41 crc kubenswrapper[4725]: I0225 11:51:41.557411 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://55972f279b171bec5e6d0dee8be26569a49cd30b83e5c71721b156cab7b1e025" gracePeriod=600
Feb 25 11:51:42 crc kubenswrapper[4725]: I0225 11:51:42.526476 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="55972f279b171bec5e6d0dee8be26569a49cd30b83e5c71721b156cab7b1e025" exitCode=0
Feb 25 11:51:42 crc kubenswrapper[4725]: I0225 11:51:42.526560 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"55972f279b171bec5e6d0dee8be26569a49cd30b83e5c71721b156cab7b1e025"}
Feb 25 11:51:42 crc kubenswrapper[4725]: I0225 11:51:42.526894 4725 scope.go:117] "RemoveContainer" containerID="92b7c4497b61bb19b37074c746d01774dde3d7dd19f8988a1293f3a0b3b89797"
Feb 25 11:51:43 crc kubenswrapper[4725]: I0225 11:51:43.545480 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c"}
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.160035 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533672-q4ngr"]
Feb 25 11:52:00 crc kubenswrapper[4725]: E0225 11:52:00.161052 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="registry-server"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.161068 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="registry-server"
Feb 25 11:52:00 crc kubenswrapper[4725]: E0225 11:52:00.161093 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="extract-utilities"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.161100 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="extract-utilities"
Feb 25 11:52:00 crc kubenswrapper[4725]: E0225 11:52:00.161127 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="extract-content"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.161134 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="extract-content"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.161337 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9d521a-6555-4ca7-ba55-8d7ac1e32e7d" containerName="registry-server"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.161925 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533672-q4ngr"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.164278 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.164413 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.165397 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.170413 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533672-q4ngr"]
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.266499 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvpr2\" (UniqueName: \"kubernetes.io/projected/26eb5fbb-a4eb-44d4-8628-aedbcdae0bff-kube-api-access-gvpr2\") pod \"auto-csr-approver-29533672-q4ngr\" (UID: \"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff\") " pod="openshift-infra/auto-csr-approver-29533672-q4ngr"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.369005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvpr2\" (UniqueName: \"kubernetes.io/projected/26eb5fbb-a4eb-44d4-8628-aedbcdae0bff-kube-api-access-gvpr2\") pod \"auto-csr-approver-29533672-q4ngr\" (UID: \"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff\") " pod="openshift-infra/auto-csr-approver-29533672-q4ngr"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.397121 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvpr2\" (UniqueName: \"kubernetes.io/projected/26eb5fbb-a4eb-44d4-8628-aedbcdae0bff-kube-api-access-gvpr2\") pod \"auto-csr-approver-29533672-q4ngr\" (UID: \"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff\") " pod="openshift-infra/auto-csr-approver-29533672-q4ngr"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.490747 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533672-q4ngr"
Feb 25 11:52:00 crc kubenswrapper[4725]: I0225 11:52:00.954292 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533672-q4ngr"]
Feb 25 11:52:01 crc kubenswrapper[4725]: I0225 11:52:01.699892 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533672-q4ngr" event={"ID":"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff","Type":"ContainerStarted","Data":"af1fff7be786b1f91b0a812b25c1b32f6eb5784a2c1b60f7ee9b2d70543b1087"}
Feb 25 11:52:03 crc kubenswrapper[4725]: I0225 11:52:03.721548 4725 generic.go:334] "Generic (PLEG): container finished" podID="26eb5fbb-a4eb-44d4-8628-aedbcdae0bff" containerID="246e8dd9fcfe2546bf59e4ae870c86db0d3e194ea742e37cae5875462787707c" exitCode=0
Feb 25 11:52:03 crc kubenswrapper[4725]: I0225 11:52:03.721610 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533672-q4ngr" event={"ID":"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff","Type":"ContainerDied","Data":"246e8dd9fcfe2546bf59e4ae870c86db0d3e194ea742e37cae5875462787707c"}
Feb 25 11:52:05 crc kubenswrapper[4725]: I0225 11:52:05.141334 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533672-q4ngr"
Feb 25 11:52:05 crc kubenswrapper[4725]: I0225 11:52:05.282574 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvpr2\" (UniqueName: \"kubernetes.io/projected/26eb5fbb-a4eb-44d4-8628-aedbcdae0bff-kube-api-access-gvpr2\") pod \"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff\" (UID: \"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff\") "
Feb 25 11:52:05 crc kubenswrapper[4725]: I0225 11:52:05.301062 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26eb5fbb-a4eb-44d4-8628-aedbcdae0bff-kube-api-access-gvpr2" (OuterVolumeSpecName: "kube-api-access-gvpr2") pod "26eb5fbb-a4eb-44d4-8628-aedbcdae0bff" (UID: "26eb5fbb-a4eb-44d4-8628-aedbcdae0bff"). InnerVolumeSpecName "kube-api-access-gvpr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 11:52:05 crc kubenswrapper[4725]: I0225 11:52:05.385640 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvpr2\" (UniqueName: \"kubernetes.io/projected/26eb5fbb-a4eb-44d4-8628-aedbcdae0bff-kube-api-access-gvpr2\") on node \"crc\" DevicePath \"\""
Feb 25 11:52:05 crc kubenswrapper[4725]: I0225 11:52:05.753365 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533672-q4ngr" event={"ID":"26eb5fbb-a4eb-44d4-8628-aedbcdae0bff","Type":"ContainerDied","Data":"af1fff7be786b1f91b0a812b25c1b32f6eb5784a2c1b60f7ee9b2d70543b1087"}
Feb 25 11:52:05 crc kubenswrapper[4725]: I0225 11:52:05.753410 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af1fff7be786b1f91b0a812b25c1b32f6eb5784a2c1b60f7ee9b2d70543b1087"
Feb 25 11:52:05 crc kubenswrapper[4725]: I0225 11:52:05.753481 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533672-q4ngr"
Feb 25 11:52:06 crc kubenswrapper[4725]: I0225 11:52:06.223133 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533666-fs596"]
Feb 25 11:52:06 crc kubenswrapper[4725]: I0225 11:52:06.231705 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533666-fs596"]
Feb 25 11:52:07 crc kubenswrapper[4725]: I0225 11:52:07.243103 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2" path="/var/lib/kubelet/pods/5ff9de35-0fa6-4ecf-93f0-0d24a39cabe2/volumes"
Feb 25 11:52:13 crc kubenswrapper[4725]: I0225 11:52:13.828321 4725 generic.go:334] "Generic (PLEG): container finished" podID="07081f50-997d-4877-be58-a446955dfe62" containerID="fdd421c9f0b75001233241cc39e85b7819903b3d0063b1edd7058faf647b5e2d" exitCode=0
Feb 25 11:52:13 crc kubenswrapper[4725]: I0225 11:52:13.828555 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"07081f50-997d-4877-be58-a446955dfe62","Type":"ContainerDied","Data":"fdd421c9f0b75001233241cc39e85b7819903b3d0063b1edd7058faf647b5e2d"}
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.214977 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297290 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-openstack-config-secret\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297613 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-openstack-config\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297717 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297791 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-workdir\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297869 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-temporary\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297909 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-config-data\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297966 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ssh-key\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.297991 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ca-certs\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.298016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mbfb\" (UniqueName: \"kubernetes.io/projected/07081f50-997d-4877-be58-a446955dfe62-kube-api-access-9mbfb\") pod \"07081f50-997d-4877-be58-a446955dfe62\" (UID: \"07081f50-997d-4877-be58-a446955dfe62\") "
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.298485 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.298672 4725 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.300232 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-config-data" (OuterVolumeSpecName: "config-data") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.303654 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.304643 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07081f50-997d-4877-be58-a446955dfe62-kube-api-access-9mbfb" (OuterVolumeSpecName: "kube-api-access-9mbfb") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "kube-api-access-9mbfb".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.304708 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.328770 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.330446 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.340059 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.358785 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "07081f50-997d-4877-be58-a446955dfe62" (UID: "07081f50-997d-4877-be58-a446955dfe62"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400518 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400558 4725 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400572 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mbfb\" (UniqueName: \"kubernetes.io/projected/07081f50-997d-4877-be58-a446955dfe62-kube-api-access-9mbfb\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400586 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07081f50-997d-4877-be58-a446955dfe62-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400598 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400638 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400652 4725 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07081f50-997d-4877-be58-a446955dfe62-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.400667 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07081f50-997d-4877-be58-a446955dfe62-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.425673 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.502433 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.852326 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.852371 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"07081f50-997d-4877-be58-a446955dfe62","Type":"ContainerDied","Data":"c4a69b690ae90015e395bc3abccc03b9614cf2aa9c0be63c6b36e0e49993eef8"} Feb 25 11:52:15 crc kubenswrapper[4725]: I0225 11:52:15.852441 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4a69b690ae90015e395bc3abccc03b9614cf2aa9c0be63c6b36e0e49993eef8" Feb 25 11:52:21 crc kubenswrapper[4725]: I0225 11:52:21.993330 4725 scope.go:117] "RemoveContainer" containerID="28ae3e7458ed12bd309f567175023ec4417e9b8f6574d6f2eada4845763b83e3" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.172659 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 11:52:26 crc kubenswrapper[4725]: E0225 11:52:26.173617 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26eb5fbb-a4eb-44d4-8628-aedbcdae0bff" containerName="oc" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.173635 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="26eb5fbb-a4eb-44d4-8628-aedbcdae0bff" containerName="oc" Feb 25 11:52:26 crc kubenswrapper[4725]: E0225 11:52:26.173654 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07081f50-997d-4877-be58-a446955dfe62" containerName="tempest-tests-tempest-tests-runner" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.173662 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="07081f50-997d-4877-be58-a446955dfe62" containerName="tempest-tests-tempest-tests-runner" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.174128 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="07081f50-997d-4877-be58-a446955dfe62" containerName="tempest-tests-tempest-tests-runner" Feb 25 11:52:26 crc 
kubenswrapper[4725]: I0225 11:52:26.174161 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="26eb5fbb-a4eb-44d4-8628-aedbcdae0bff" containerName="oc" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.174737 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.177353 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-4svv6" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.181509 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.304129 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxs7s\" (UniqueName: \"kubernetes.io/projected/6d6cd6ff-f8a8-4cab-b786-90440d19dbf1-kube-api-access-vxs7s\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.304684 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.408738 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxs7s\" (UniqueName: \"kubernetes.io/projected/6d6cd6ff-f8a8-4cab-b786-90440d19dbf1-kube-api-access-vxs7s\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.410378 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.411181 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.433345 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxs7s\" (UniqueName: \"kubernetes.io/projected/6d6cd6ff-f8a8-4cab-b786-90440d19dbf1-kube-api-access-vxs7s\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.458991 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.496463 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 11:52:26 crc kubenswrapper[4725]: I0225 11:52:26.950766 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 11:52:26 crc kubenswrapper[4725]: W0225 11:52:26.952226 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d6cd6ff_f8a8_4cab_b786_90440d19dbf1.slice/crio-4e50c836fd58956c55e6aa10dfc3dfb170721b78dfcfa8465ef4a2155685b02c WatchSource:0}: Error finding container 4e50c836fd58956c55e6aa10dfc3dfb170721b78dfcfa8465ef4a2155685b02c: Status 404 returned error can't find the container with id 4e50c836fd58956c55e6aa10dfc3dfb170721b78dfcfa8465ef4a2155685b02c Feb 25 11:52:27 crc kubenswrapper[4725]: I0225 11:52:27.971083 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1","Type":"ContainerStarted","Data":"4e50c836fd58956c55e6aa10dfc3dfb170721b78dfcfa8465ef4a2155685b02c"} Feb 25 11:52:32 crc kubenswrapper[4725]: I0225 11:52:32.012375 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6d6cd6ff-f8a8-4cab-b786-90440d19dbf1","Type":"ContainerStarted","Data":"27ad98bd8bafa83e1eb6155b01775e34c4eb21bfbae7705c7e6b5649323b75d3"} Feb 25 11:52:32 crc kubenswrapper[4725]: I0225 11:52:32.052203 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.24790924 podStartE2EDuration="6.052169389s" podCreationTimestamp="2026-02-25 11:52:26 +0000 UTC" firstStartedPulling="2026-02-25 11:52:26.954589925 +0000 UTC m=+3572.453171950" lastFinishedPulling="2026-02-25 11:52:30.758850074 +0000 UTC m=+3576.257432099" 
observedRunningTime="2026-02-25 11:52:32.045756171 +0000 UTC m=+3577.544338226" watchObservedRunningTime="2026-02-25 11:52:32.052169389 +0000 UTC m=+3577.550751444" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.406813 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmjfg/must-gather-jp2kj"] Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.409178 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.413235 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xmjfg"/"kube-root-ca.crt" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.413430 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xmjfg"/"openshift-service-ca.crt" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.413434 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xmjfg"/"default-dockercfg-ghcgp" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.427055 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xmjfg/must-gather-jp2kj"] Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.574552 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54161a52-ee5a-492c-ba0b-2d9292adb410-must-gather-output\") pod \"must-gather-jp2kj\" (UID: \"54161a52-ee5a-492c-ba0b-2d9292adb410\") " pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.574628 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bqrh\" (UniqueName: \"kubernetes.io/projected/54161a52-ee5a-492c-ba0b-2d9292adb410-kube-api-access-5bqrh\") pod \"must-gather-jp2kj\" (UID: 
\"54161a52-ee5a-492c-ba0b-2d9292adb410\") " pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.676675 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54161a52-ee5a-492c-ba0b-2d9292adb410-must-gather-output\") pod \"must-gather-jp2kj\" (UID: \"54161a52-ee5a-492c-ba0b-2d9292adb410\") " pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.676744 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bqrh\" (UniqueName: \"kubernetes.io/projected/54161a52-ee5a-492c-ba0b-2d9292adb410-kube-api-access-5bqrh\") pod \"must-gather-jp2kj\" (UID: \"54161a52-ee5a-492c-ba0b-2d9292adb410\") " pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.677187 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54161a52-ee5a-492c-ba0b-2d9292adb410-must-gather-output\") pod \"must-gather-jp2kj\" (UID: \"54161a52-ee5a-492c-ba0b-2d9292adb410\") " pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.694819 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bqrh\" (UniqueName: \"kubernetes.io/projected/54161a52-ee5a-492c-ba0b-2d9292adb410-kube-api-access-5bqrh\") pod \"must-gather-jp2kj\" (UID: \"54161a52-ee5a-492c-ba0b-2d9292adb410\") " pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:54 crc kubenswrapper[4725]: I0225 11:52:54.728392 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:52:55 crc kubenswrapper[4725]: I0225 11:52:55.203168 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xmjfg/must-gather-jp2kj"] Feb 25 11:52:55 crc kubenswrapper[4725]: I0225 11:52:55.258435 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" event={"ID":"54161a52-ee5a-492c-ba0b-2d9292adb410","Type":"ContainerStarted","Data":"206704d07bafcc4f54084d9c0625ca6278b467a42ba72b4136bc420c57e4bed3"} Feb 25 11:53:03 crc kubenswrapper[4725]: I0225 11:53:03.341473 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" event={"ID":"54161a52-ee5a-492c-ba0b-2d9292adb410","Type":"ContainerStarted","Data":"5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca"} Feb 25 11:53:03 crc kubenswrapper[4725]: I0225 11:53:03.342210 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" event={"ID":"54161a52-ee5a-492c-ba0b-2d9292adb410","Type":"ContainerStarted","Data":"3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f"} Feb 25 11:53:03 crc kubenswrapper[4725]: I0225 11:53:03.381564 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" podStartSLOduration=2.223974768 podStartE2EDuration="9.381537897s" podCreationTimestamp="2026-02-25 11:52:54 +0000 UTC" firstStartedPulling="2026-02-25 11:52:55.204096205 +0000 UTC m=+3600.702678220" lastFinishedPulling="2026-02-25 11:53:02.361659324 +0000 UTC m=+3607.860241349" observedRunningTime="2026-02-25 11:53:03.373727542 +0000 UTC m=+3608.872309577" watchObservedRunningTime="2026-02-25 11:53:03.381537897 +0000 UTC m=+3608.880119922" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.300777 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-xmjfg/crc-debug-nvrnn"] Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.302660 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.403709 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea0b08b9-c4ac-4c79-a161-856990173656-host\") pod \"crc-debug-nvrnn\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.403786 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/ea0b08b9-c4ac-4c79-a161-856990173656-kube-api-access-gcmkq\") pod \"crc-debug-nvrnn\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.506055 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea0b08b9-c4ac-4c79-a161-856990173656-host\") pod \"crc-debug-nvrnn\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.506140 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/ea0b08b9-c4ac-4c79-a161-856990173656-kube-api-access-gcmkq\") pod \"crc-debug-nvrnn\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.506201 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ea0b08b9-c4ac-4c79-a161-856990173656-host\") pod \"crc-debug-nvrnn\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.533532 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/ea0b08b9-c4ac-4c79-a161-856990173656-kube-api-access-gcmkq\") pod \"crc-debug-nvrnn\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: I0225 11:53:06.627460 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:53:06 crc kubenswrapper[4725]: W0225 11:53:06.685640 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea0b08b9_c4ac_4c79_a161_856990173656.slice/crio-e6c3256f9cce7715ff218f6cb21df3d0909f9757ad5bb0c2d915e8d78159c9c7 WatchSource:0}: Error finding container e6c3256f9cce7715ff218f6cb21df3d0909f9757ad5bb0c2d915e8d78159c9c7: Status 404 returned error can't find the container with id e6c3256f9cce7715ff218f6cb21df3d0909f9757ad5bb0c2d915e8d78159c9c7 Feb 25 11:53:07 crc kubenswrapper[4725]: I0225 11:53:07.384911 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" event={"ID":"ea0b08b9-c4ac-4c79-a161-856990173656","Type":"ContainerStarted","Data":"e6c3256f9cce7715ff218f6cb21df3d0909f9757ad5bb0c2d915e8d78159c9c7"} Feb 25 11:53:20 crc kubenswrapper[4725]: I0225 11:53:20.504212 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" event={"ID":"ea0b08b9-c4ac-4c79-a161-856990173656","Type":"ContainerStarted","Data":"55909d9ea9318598e2b35ba0cda0f1f15b1b1c4953ba08213c9c82fa0619d411"} Feb 25 11:53:20 crc kubenswrapper[4725]: I0225 
11:53:20.525581 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" podStartSLOduration=1.705187349 podStartE2EDuration="14.52556186s" podCreationTimestamp="2026-02-25 11:53:06 +0000 UTC" firstStartedPulling="2026-02-25 11:53:06.687545796 +0000 UTC m=+3612.186127821" lastFinishedPulling="2026-02-25 11:53:19.507920307 +0000 UTC m=+3625.006502332" observedRunningTime="2026-02-25 11:53:20.518548116 +0000 UTC m=+3626.017130141" watchObservedRunningTime="2026-02-25 11:53:20.52556186 +0000 UTC m=+3626.024143885" Feb 25 11:53:57 crc kubenswrapper[4725]: I0225 11:53:57.807099 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w5f"] Feb 25 11:53:57 crc kubenswrapper[4725]: I0225 11:53:57.810001 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:57 crc kubenswrapper[4725]: I0225 11:53:57.816723 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w5f"] Feb 25 11:53:57 crc kubenswrapper[4725]: I0225 11:53:57.953185 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfcqp\" (UniqueName: \"kubernetes.io/projected/b76048ec-50e0-47bb-9849-5fe72e887ae0-kube-api-access-bfcqp\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:57 crc kubenswrapper[4725]: I0225 11:53:57.953301 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-catalog-content\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:57 crc kubenswrapper[4725]: 
I0225 11:53:57.953648 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-utilities\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.055950 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-utilities\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.056047 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcqp\" (UniqueName: \"kubernetes.io/projected/b76048ec-50e0-47bb-9849-5fe72e887ae0-kube-api-access-bfcqp\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.056078 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-catalog-content\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.056406 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-utilities\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.056457 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-catalog-content\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.074797 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcqp\" (UniqueName: \"kubernetes.io/projected/b76048ec-50e0-47bb-9849-5fe72e887ae0-kube-api-access-bfcqp\") pod \"redhat-marketplace-p2w5f\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.141336 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.654680 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w5f"] Feb 25 11:53:58 crc kubenswrapper[4725]: W0225 11:53:58.668016 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb76048ec_50e0_47bb_9849_5fe72e887ae0.slice/crio-76dd7197e97c2e3b63d551ae05af9651c1fe1ff3388e720e5a61372990fbfbbf WatchSource:0}: Error finding container 76dd7197e97c2e3b63d551ae05af9651c1fe1ff3388e720e5a61372990fbfbbf: Status 404 returned error can't find the container with id 76dd7197e97c2e3b63d551ae05af9651c1fe1ff3388e720e5a61372990fbfbbf Feb 25 11:53:58 crc kubenswrapper[4725]: I0225 11:53:58.846445 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w5f" event={"ID":"b76048ec-50e0-47bb-9849-5fe72e887ae0","Type":"ContainerStarted","Data":"76dd7197e97c2e3b63d551ae05af9651c1fe1ff3388e720e5a61372990fbfbbf"} Feb 25 11:53:59 crc kubenswrapper[4725]: I0225 
11:53:59.860209 4725 generic.go:334] "Generic (PLEG): container finished" podID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerID="9fe5a01a20e4ff516797de838d1c0d2a3ecb9f82f8f3bf7c7dc8356f978dc27b" exitCode=0 Feb 25 11:53:59 crc kubenswrapper[4725]: I0225 11:53:59.860539 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w5f" event={"ID":"b76048ec-50e0-47bb-9849-5fe72e887ae0","Type":"ContainerDied","Data":"9fe5a01a20e4ff516797de838d1c0d2a3ecb9f82f8f3bf7c7dc8356f978dc27b"} Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.144686 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533674-mgrk8"] Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.146280 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533674-mgrk8" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.150044 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.150578 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.150754 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.160369 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533674-mgrk8"] Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.298700 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28p5n\" (UniqueName: \"kubernetes.io/projected/9bae8743-62db-4ba4-bbe3-6cdc5faf3fab-kube-api-access-28p5n\") pod \"auto-csr-approver-29533674-mgrk8\" (UID: \"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab\") " 
pod="openshift-infra/auto-csr-approver-29533674-mgrk8" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.400778 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28p5n\" (UniqueName: \"kubernetes.io/projected/9bae8743-62db-4ba4-bbe3-6cdc5faf3fab-kube-api-access-28p5n\") pod \"auto-csr-approver-29533674-mgrk8\" (UID: \"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab\") " pod="openshift-infra/auto-csr-approver-29533674-mgrk8" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.424197 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28p5n\" (UniqueName: \"kubernetes.io/projected/9bae8743-62db-4ba4-bbe3-6cdc5faf3fab-kube-api-access-28p5n\") pod \"auto-csr-approver-29533674-mgrk8\" (UID: \"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab\") " pod="openshift-infra/auto-csr-approver-29533674-mgrk8" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.469640 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533674-mgrk8" Feb 25 11:54:00 crc kubenswrapper[4725]: I0225 11:54:00.983742 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533674-mgrk8"] Feb 25 11:54:01 crc kubenswrapper[4725]: I0225 11:54:01.877599 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533674-mgrk8" event={"ID":"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab","Type":"ContainerStarted","Data":"09d5e941985006ca74645cd279b0cfb954dec44faf97af7ebdbe6177e0fff5a7"} Feb 25 11:54:02 crc kubenswrapper[4725]: I0225 11:54:02.904894 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w5f" event={"ID":"b76048ec-50e0-47bb-9849-5fe72e887ae0","Type":"ContainerStarted","Data":"09138f4105649d7882e72851f2813be548d87d6757b9e8dce2f39380a59306b4"} Feb 25 11:54:03 crc kubenswrapper[4725]: I0225 11:54:03.915758 4725 generic.go:334] "Generic (PLEG): 
container finished" podID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerID="09138f4105649d7882e72851f2813be548d87d6757b9e8dce2f39380a59306b4" exitCode=0 Feb 25 11:54:03 crc kubenswrapper[4725]: I0225 11:54:03.915880 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w5f" event={"ID":"b76048ec-50e0-47bb-9849-5fe72e887ae0","Type":"ContainerDied","Data":"09138f4105649d7882e72851f2813be548d87d6757b9e8dce2f39380a59306b4"} Feb 25 11:54:03 crc kubenswrapper[4725]: I0225 11:54:03.916463 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w5f" event={"ID":"b76048ec-50e0-47bb-9849-5fe72e887ae0","Type":"ContainerStarted","Data":"37760fd477badc1e89b39eb0e039ae82d336e26fdddfb6739b58c03dc4c3e8d6"} Feb 25 11:54:03 crc kubenswrapper[4725]: I0225 11:54:03.918086 4725 generic.go:334] "Generic (PLEG): container finished" podID="9bae8743-62db-4ba4-bbe3-6cdc5faf3fab" containerID="a714fb5b0fbbd6eb1c72092440acbefab30f49029438acceed22e82f3f0b6ac3" exitCode=0 Feb 25 11:54:03 crc kubenswrapper[4725]: I0225 11:54:03.918122 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533674-mgrk8" event={"ID":"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab","Type":"ContainerDied","Data":"a714fb5b0fbbd6eb1c72092440acbefab30f49029438acceed22e82f3f0b6ac3"} Feb 25 11:54:03 crc kubenswrapper[4725]: I0225 11:54:03.934859 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p2w5f" podStartSLOduration=3.315466916 podStartE2EDuration="6.934821845s" podCreationTimestamp="2026-02-25 11:53:57 +0000 UTC" firstStartedPulling="2026-02-25 11:53:59.864133547 +0000 UTC m=+3665.362715572" lastFinishedPulling="2026-02-25 11:54:03.483488476 +0000 UTC m=+3668.982070501" observedRunningTime="2026-02-25 11:54:03.932479693 +0000 UTC m=+3669.431061718" watchObservedRunningTime="2026-02-25 11:54:03.934821845 +0000 UTC 
m=+3669.433403880" Feb 25 11:54:05 crc kubenswrapper[4725]: I0225 11:54:05.298098 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533674-mgrk8" Feb 25 11:54:05 crc kubenswrapper[4725]: I0225 11:54:05.390598 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28p5n\" (UniqueName: \"kubernetes.io/projected/9bae8743-62db-4ba4-bbe3-6cdc5faf3fab-kube-api-access-28p5n\") pod \"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab\" (UID: \"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab\") " Feb 25 11:54:05 crc kubenswrapper[4725]: I0225 11:54:05.403619 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bae8743-62db-4ba4-bbe3-6cdc5faf3fab-kube-api-access-28p5n" (OuterVolumeSpecName: "kube-api-access-28p5n") pod "9bae8743-62db-4ba4-bbe3-6cdc5faf3fab" (UID: "9bae8743-62db-4ba4-bbe3-6cdc5faf3fab"). InnerVolumeSpecName "kube-api-access-28p5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:54:05 crc kubenswrapper[4725]: I0225 11:54:05.492645 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28p5n\" (UniqueName: \"kubernetes.io/projected/9bae8743-62db-4ba4-bbe3-6cdc5faf3fab-kube-api-access-28p5n\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:05 crc kubenswrapper[4725]: I0225 11:54:05.936334 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533674-mgrk8" event={"ID":"9bae8743-62db-4ba4-bbe3-6cdc5faf3fab","Type":"ContainerDied","Data":"09d5e941985006ca74645cd279b0cfb954dec44faf97af7ebdbe6177e0fff5a7"} Feb 25 11:54:05 crc kubenswrapper[4725]: I0225 11:54:05.936626 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d5e941985006ca74645cd279b0cfb954dec44faf97af7ebdbe6177e0fff5a7" Feb 25 11:54:05 crc kubenswrapper[4725]: I0225 11:54:05.936387 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533674-mgrk8" Feb 25 11:54:06 crc kubenswrapper[4725]: I0225 11:54:06.371623 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533668-q9582"] Feb 25 11:54:06 crc kubenswrapper[4725]: I0225 11:54:06.381502 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533668-q9582"] Feb 25 11:54:07 crc kubenswrapper[4725]: I0225 11:54:07.243324 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f49a014-2668-4dc4-bf75-c7c41a71c209" path="/var/lib/kubelet/pods/9f49a014-2668-4dc4-bf75-c7c41a71c209/volumes" Feb 25 11:54:08 crc kubenswrapper[4725]: I0225 11:54:08.141778 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:54:08 crc kubenswrapper[4725]: I0225 11:54:08.141821 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:54:08 crc kubenswrapper[4725]: I0225 11:54:08.199200 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:54:09 crc kubenswrapper[4725]: I0225 11:54:09.020662 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:54:09 crc kubenswrapper[4725]: I0225 11:54:09.188105 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w5f"] Feb 25 11:54:10 crc kubenswrapper[4725]: I0225 11:54:10.980340 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p2w5f" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerName="registry-server" containerID="cri-o://37760fd477badc1e89b39eb0e039ae82d336e26fdddfb6739b58c03dc4c3e8d6" gracePeriod=2 Feb 25 11:54:11 crc 
kubenswrapper[4725]: I0225 11:54:11.555728 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:54:11 crc kubenswrapper[4725]: I0225 11:54:11.556033 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.005036 4725 generic.go:334] "Generic (PLEG): container finished" podID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerID="37760fd477badc1e89b39eb0e039ae82d336e26fdddfb6739b58c03dc4c3e8d6" exitCode=0 Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.005092 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w5f" event={"ID":"b76048ec-50e0-47bb-9849-5fe72e887ae0","Type":"ContainerDied","Data":"37760fd477badc1e89b39eb0e039ae82d336e26fdddfb6739b58c03dc4c3e8d6"} Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.005132 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w5f" event={"ID":"b76048ec-50e0-47bb-9849-5fe72e887ae0","Type":"ContainerDied","Data":"76dd7197e97c2e3b63d551ae05af9651c1fe1ff3388e720e5a61372990fbfbbf"} Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.005149 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dd7197e97c2e3b63d551ae05af9651c1fe1ff3388e720e5a61372990fbfbbf" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.007220 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.121455 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-utilities\") pod \"b76048ec-50e0-47bb-9849-5fe72e887ae0\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.121604 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfcqp\" (UniqueName: \"kubernetes.io/projected/b76048ec-50e0-47bb-9849-5fe72e887ae0-kube-api-access-bfcqp\") pod \"b76048ec-50e0-47bb-9849-5fe72e887ae0\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.121685 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-catalog-content\") pod \"b76048ec-50e0-47bb-9849-5fe72e887ae0\" (UID: \"b76048ec-50e0-47bb-9849-5fe72e887ae0\") " Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.122929 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-utilities" (OuterVolumeSpecName: "utilities") pod "b76048ec-50e0-47bb-9849-5fe72e887ae0" (UID: "b76048ec-50e0-47bb-9849-5fe72e887ae0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.129425 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76048ec-50e0-47bb-9849-5fe72e887ae0-kube-api-access-bfcqp" (OuterVolumeSpecName: "kube-api-access-bfcqp") pod "b76048ec-50e0-47bb-9849-5fe72e887ae0" (UID: "b76048ec-50e0-47bb-9849-5fe72e887ae0"). InnerVolumeSpecName "kube-api-access-bfcqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.148369 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b76048ec-50e0-47bb-9849-5fe72e887ae0" (UID: "b76048ec-50e0-47bb-9849-5fe72e887ae0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.225071 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfcqp\" (UniqueName: \"kubernetes.io/projected/b76048ec-50e0-47bb-9849-5fe72e887ae0-kube-api-access-bfcqp\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.225220 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:12 crc kubenswrapper[4725]: I0225 11:54:12.225251 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76048ec-50e0-47bb-9849-5fe72e887ae0-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:13 crc kubenswrapper[4725]: I0225 11:54:13.017441 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w5f" Feb 25 11:54:13 crc kubenswrapper[4725]: I0225 11:54:13.060616 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w5f"] Feb 25 11:54:13 crc kubenswrapper[4725]: I0225 11:54:13.069508 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w5f"] Feb 25 11:54:13 crc kubenswrapper[4725]: I0225 11:54:13.237073 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" path="/var/lib/kubelet/pods/b76048ec-50e0-47bb-9849-5fe72e887ae0/volumes" Feb 25 11:54:20 crc kubenswrapper[4725]: I0225 11:54:20.082454 4725 generic.go:334] "Generic (PLEG): container finished" podID="ea0b08b9-c4ac-4c79-a161-856990173656" containerID="55909d9ea9318598e2b35ba0cda0f1f15b1b1c4953ba08213c9c82fa0619d411" exitCode=0 Feb 25 11:54:20 crc kubenswrapper[4725]: I0225 11:54:20.082558 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" event={"ID":"ea0b08b9-c4ac-4c79-a161-856990173656","Type":"ContainerDied","Data":"55909d9ea9318598e2b35ba0cda0f1f15b1b1c4953ba08213c9c82fa0619d411"} Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.218475 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.269489 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-nvrnn"] Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.277449 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-nvrnn"] Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.335687 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea0b08b9-c4ac-4c79-a161-856990173656-host\") pod \"ea0b08b9-c4ac-4c79-a161-856990173656\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.335816 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea0b08b9-c4ac-4c79-a161-856990173656-host" (OuterVolumeSpecName: "host") pod "ea0b08b9-c4ac-4c79-a161-856990173656" (UID: "ea0b08b9-c4ac-4c79-a161-856990173656"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.335857 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/ea0b08b9-c4ac-4c79-a161-856990173656-kube-api-access-gcmkq\") pod \"ea0b08b9-c4ac-4c79-a161-856990173656\" (UID: \"ea0b08b9-c4ac-4c79-a161-856990173656\") " Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.336496 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea0b08b9-c4ac-4c79-a161-856990173656-host\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.342549 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0b08b9-c4ac-4c79-a161-856990173656-kube-api-access-gcmkq" (OuterVolumeSpecName: "kube-api-access-gcmkq") pod "ea0b08b9-c4ac-4c79-a161-856990173656" (UID: "ea0b08b9-c4ac-4c79-a161-856990173656"). InnerVolumeSpecName "kube-api-access-gcmkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:54:21 crc kubenswrapper[4725]: I0225 11:54:21.438783 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/ea0b08b9-c4ac-4c79-a161-856990173656-kube-api-access-gcmkq\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.077945 4725 scope.go:117] "RemoveContainer" containerID="579087ef6d0d0dadc41f59207ebb8127e2674b95834ca36ce5ba7ce3b109a51d" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.117594 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6c3256f9cce7715ff218f6cb21df3d0909f9757ad5bb0c2d915e8d78159c9c7" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.117803 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-nvrnn" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.472919 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-9bjz9"] Feb 25 11:54:22 crc kubenswrapper[4725]: E0225 11:54:22.473454 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerName="registry-server" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473472 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerName="registry-server" Feb 25 11:54:22 crc kubenswrapper[4725]: E0225 11:54:22.473494 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerName="extract-utilities" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473503 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerName="extract-utilities" Feb 25 11:54:22 crc kubenswrapper[4725]: E0225 11:54:22.473525 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0b08b9-c4ac-4c79-a161-856990173656" containerName="container-00" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473533 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0b08b9-c4ac-4c79-a161-856990173656" containerName="container-00" Feb 25 11:54:22 crc kubenswrapper[4725]: E0225 11:54:22.473549 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bae8743-62db-4ba4-bbe3-6cdc5faf3fab" containerName="oc" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473557 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bae8743-62db-4ba4-bbe3-6cdc5faf3fab" containerName="oc" Feb 25 11:54:22 crc kubenswrapper[4725]: E0225 11:54:22.473571 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" 
containerName="extract-content" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473581 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerName="extract-content" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473867 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bae8743-62db-4ba4-bbe3-6cdc5faf3fab" containerName="oc" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473893 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0b08b9-c4ac-4c79-a161-856990173656" containerName="container-00" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.473913 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76048ec-50e0-47bb-9849-5fe72e887ae0" containerName="registry-server" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.474746 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.558596 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq6qh\" (UniqueName: \"kubernetes.io/projected/365a52ea-84d2-4e73-942d-08fe2a6825e0-kube-api-access-pq6qh\") pod \"crc-debug-9bjz9\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.558684 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/365a52ea-84d2-4e73-942d-08fe2a6825e0-host\") pod \"crc-debug-9bjz9\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.660902 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq6qh\" (UniqueName: 
\"kubernetes.io/projected/365a52ea-84d2-4e73-942d-08fe2a6825e0-kube-api-access-pq6qh\") pod \"crc-debug-9bjz9\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.661320 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/365a52ea-84d2-4e73-942d-08fe2a6825e0-host\") pod \"crc-debug-9bjz9\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.661540 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/365a52ea-84d2-4e73-942d-08fe2a6825e0-host\") pod \"crc-debug-9bjz9\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.680322 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq6qh\" (UniqueName: \"kubernetes.io/projected/365a52ea-84d2-4e73-942d-08fe2a6825e0-kube-api-access-pq6qh\") pod \"crc-debug-9bjz9\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:22 crc kubenswrapper[4725]: I0225 11:54:22.792737 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:23 crc kubenswrapper[4725]: I0225 11:54:23.126247 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" event={"ID":"365a52ea-84d2-4e73-942d-08fe2a6825e0","Type":"ContainerStarted","Data":"763aae95e51b05ecf2db8bd941228401df069fc46413010c5b05212a7f794b4b"} Feb 25 11:54:23 crc kubenswrapper[4725]: I0225 11:54:23.241430 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0b08b9-c4ac-4c79-a161-856990173656" path="/var/lib/kubelet/pods/ea0b08b9-c4ac-4c79-a161-856990173656/volumes" Feb 25 11:54:24 crc kubenswrapper[4725]: I0225 11:54:24.137663 4725 generic.go:334] "Generic (PLEG): container finished" podID="365a52ea-84d2-4e73-942d-08fe2a6825e0" containerID="fd1984160a0028bee00ef10a35913cf14e864623bd671a81dbe79153d576f1c8" exitCode=0 Feb 25 11:54:24 crc kubenswrapper[4725]: I0225 11:54:24.137723 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" event={"ID":"365a52ea-84d2-4e73-942d-08fe2a6825e0","Type":"ContainerDied","Data":"fd1984160a0028bee00ef10a35913cf14e864623bd671a81dbe79153d576f1c8"} Feb 25 11:54:24 crc kubenswrapper[4725]: I0225 11:54:24.704817 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-9bjz9"] Feb 25 11:54:24 crc kubenswrapper[4725]: I0225 11:54:24.714517 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-9bjz9"] Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.288483 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.416131 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq6qh\" (UniqueName: \"kubernetes.io/projected/365a52ea-84d2-4e73-942d-08fe2a6825e0-kube-api-access-pq6qh\") pod \"365a52ea-84d2-4e73-942d-08fe2a6825e0\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.416293 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/365a52ea-84d2-4e73-942d-08fe2a6825e0-host\") pod \"365a52ea-84d2-4e73-942d-08fe2a6825e0\" (UID: \"365a52ea-84d2-4e73-942d-08fe2a6825e0\") " Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.416418 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/365a52ea-84d2-4e73-942d-08fe2a6825e0-host" (OuterVolumeSpecName: "host") pod "365a52ea-84d2-4e73-942d-08fe2a6825e0" (UID: "365a52ea-84d2-4e73-942d-08fe2a6825e0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.417109 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/365a52ea-84d2-4e73-942d-08fe2a6825e0-host\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.440114 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365a52ea-84d2-4e73-942d-08fe2a6825e0-kube-api-access-pq6qh" (OuterVolumeSpecName: "kube-api-access-pq6qh") pod "365a52ea-84d2-4e73-942d-08fe2a6825e0" (UID: "365a52ea-84d2-4e73-942d-08fe2a6825e0"). InnerVolumeSpecName "kube-api-access-pq6qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.519078 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq6qh\" (UniqueName: \"kubernetes.io/projected/365a52ea-84d2-4e73-942d-08fe2a6825e0-kube-api-access-pq6qh\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.910021 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-z4h6g"] Feb 25 11:54:25 crc kubenswrapper[4725]: E0225 11:54:25.910384 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365a52ea-84d2-4e73-942d-08fe2a6825e0" containerName="container-00" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.910396 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="365a52ea-84d2-4e73-942d-08fe2a6825e0" containerName="container-00" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.910597 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="365a52ea-84d2-4e73-942d-08fe2a6825e0" containerName="container-00" Feb 25 11:54:25 crc kubenswrapper[4725]: I0225 11:54:25.911229 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.028371 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2qwz\" (UniqueName: \"kubernetes.io/projected/38102000-b649-4560-8bfb-1d734ebffa09-kube-api-access-q2qwz\") pod \"crc-debug-z4h6g\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.028451 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38102000-b649-4560-8bfb-1d734ebffa09-host\") pod \"crc-debug-z4h6g\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.130403 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2qwz\" (UniqueName: \"kubernetes.io/projected/38102000-b649-4560-8bfb-1d734ebffa09-kube-api-access-q2qwz\") pod \"crc-debug-z4h6g\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.130489 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38102000-b649-4560-8bfb-1d734ebffa09-host\") pod \"crc-debug-z4h6g\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.130592 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38102000-b649-4560-8bfb-1d734ebffa09-host\") pod \"crc-debug-z4h6g\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:26 crc 
kubenswrapper[4725]: I0225 11:54:26.149110 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2qwz\" (UniqueName: \"kubernetes.io/projected/38102000-b649-4560-8bfb-1d734ebffa09-kube-api-access-q2qwz\") pod \"crc-debug-z4h6g\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.157405 4725 scope.go:117] "RemoveContainer" containerID="fd1984160a0028bee00ef10a35913cf14e864623bd671a81dbe79153d576f1c8" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.157571 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-9bjz9" Feb 25 11:54:26 crc kubenswrapper[4725]: I0225 11:54:26.228633 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:27 crc kubenswrapper[4725]: I0225 11:54:27.166668 4725 generic.go:334] "Generic (PLEG): container finished" podID="38102000-b649-4560-8bfb-1d734ebffa09" containerID="5617ab4701f3d10b486304611637959badb74c5ddf9eac76d9364e9d430839aa" exitCode=0 Feb 25 11:54:27 crc kubenswrapper[4725]: I0225 11:54:27.166772 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" event={"ID":"38102000-b649-4560-8bfb-1d734ebffa09","Type":"ContainerDied","Data":"5617ab4701f3d10b486304611637959badb74c5ddf9eac76d9364e9d430839aa"} Feb 25 11:54:27 crc kubenswrapper[4725]: I0225 11:54:27.167076 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" event={"ID":"38102000-b649-4560-8bfb-1d734ebffa09","Type":"ContainerStarted","Data":"244eba105e40fb3e5b031db9283a7d5f5a22a6b0d87922912f4e5be374b22ac2"} Feb 25 11:54:27 crc kubenswrapper[4725]: I0225 11:54:27.208872 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-z4h6g"] Feb 25 
11:54:27 crc kubenswrapper[4725]: I0225 11:54:27.217338 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmjfg/crc-debug-z4h6g"] Feb 25 11:54:27 crc kubenswrapper[4725]: I0225 11:54:27.236886 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365a52ea-84d2-4e73-942d-08fe2a6825e0" path="/var/lib/kubelet/pods/365a52ea-84d2-4e73-942d-08fe2a6825e0/volumes" Feb 25 11:54:28 crc kubenswrapper[4725]: I0225 11:54:28.294935 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:28 crc kubenswrapper[4725]: I0225 11:54:28.368873 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2qwz\" (UniqueName: \"kubernetes.io/projected/38102000-b649-4560-8bfb-1d734ebffa09-kube-api-access-q2qwz\") pod \"38102000-b649-4560-8bfb-1d734ebffa09\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " Feb 25 11:54:28 crc kubenswrapper[4725]: I0225 11:54:28.369105 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38102000-b649-4560-8bfb-1d734ebffa09-host\") pod \"38102000-b649-4560-8bfb-1d734ebffa09\" (UID: \"38102000-b649-4560-8bfb-1d734ebffa09\") " Feb 25 11:54:28 crc kubenswrapper[4725]: I0225 11:54:28.369226 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38102000-b649-4560-8bfb-1d734ebffa09-host" (OuterVolumeSpecName: "host") pod "38102000-b649-4560-8bfb-1d734ebffa09" (UID: "38102000-b649-4560-8bfb-1d734ebffa09"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 11:54:28 crc kubenswrapper[4725]: I0225 11:54:28.369779 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38102000-b649-4560-8bfb-1d734ebffa09-host\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:28 crc kubenswrapper[4725]: I0225 11:54:28.374381 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38102000-b649-4560-8bfb-1d734ebffa09-kube-api-access-q2qwz" (OuterVolumeSpecName: "kube-api-access-q2qwz") pod "38102000-b649-4560-8bfb-1d734ebffa09" (UID: "38102000-b649-4560-8bfb-1d734ebffa09"). InnerVolumeSpecName "kube-api-access-q2qwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:54:28 crc kubenswrapper[4725]: I0225 11:54:28.471472 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2qwz\" (UniqueName: \"kubernetes.io/projected/38102000-b649-4560-8bfb-1d734ebffa09-kube-api-access-q2qwz\") on node \"crc\" DevicePath \"\"" Feb 25 11:54:29 crc kubenswrapper[4725]: I0225 11:54:29.190253 4725 scope.go:117] "RemoveContainer" containerID="5617ab4701f3d10b486304611637959badb74c5ddf9eac76d9364e9d430839aa" Feb 25 11:54:29 crc kubenswrapper[4725]: I0225 11:54:29.190287 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/crc-debug-z4h6g" Feb 25 11:54:29 crc kubenswrapper[4725]: I0225 11:54:29.237451 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38102000-b649-4560-8bfb-1d734ebffa09" path="/var/lib/kubelet/pods/38102000-b649-4560-8bfb-1d734ebffa09/volumes" Feb 25 11:54:41 crc kubenswrapper[4725]: I0225 11:54:41.554977 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:54:41 crc kubenswrapper[4725]: I0225 11:54:41.555421 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:54:42 crc kubenswrapper[4725]: I0225 11:54:42.947997 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84dc96ccc8-zhwrq_b4ab7d45-3a36-4ffc-9004-62ff70fbfe53/barbican-api/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.093507 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84dc96ccc8-zhwrq_b4ab7d45-3a36-4ffc-9004-62ff70fbfe53/barbican-api-log/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.145907 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b8b9cdb6b-d9zj4_b77182d3-74cf-4a61-a3a1-81efff62da8d/barbican-keystone-listener/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.238750 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-5b8b9cdb6b-d9zj4_b77182d3-74cf-4a61-a3a1-81efff62da8d/barbican-keystone-listener-log/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.394148 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6df8d5688f-fkmbb_09976716-81ab-4d43-8250-fe3812bc8029/barbican-worker-log/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.413642 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6df8d5688f-fkmbb_09976716-81ab-4d43-8250-fe3812bc8029/barbican-worker/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.630633 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl_a1b2db62-0e44-475c-bd55-aeceb2068aed/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.681857 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/ceilometer-central-agent/1.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.749945 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/ceilometer-central-agent/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.849359 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/ceilometer-notification-agent/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.874907 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/proxy-httpd/0.log" Feb 25 11:54:43 crc kubenswrapper[4725]: I0225 11:54:43.894224 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/sg-core/0.log" Feb 25 11:54:44 crc 
kubenswrapper[4725]: I0225 11:54:44.094416 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca608800-07d2-4b62-8ac2-e544a667d664/cinder-api/0.log" Feb 25 11:54:44 crc kubenswrapper[4725]: I0225 11:54:44.117351 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca608800-07d2-4b62-8ac2-e544a667d664/cinder-api-log/0.log" Feb 25 11:54:44 crc kubenswrapper[4725]: I0225 11:54:44.215259 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5a023b0b-cd51-47db-9fdf-74c673713272/cinder-scheduler/0.log" Feb 25 11:54:44 crc kubenswrapper[4725]: I0225 11:54:44.348545 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5a023b0b-cd51-47db-9fdf-74c673713272/probe/0.log" Feb 25 11:54:44 crc kubenswrapper[4725]: I0225 11:54:44.393150 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb_d3ef192a-3ad7-445f-b029-580b9e395372/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:44 crc kubenswrapper[4725]: I0225 11:54:44.722504 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5_3118e370-4c72-4fc4-bf2b-d27645473666/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:44 crc kubenswrapper[4725]: I0225 11:54:44.791104 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-hrfcv_f0789964-49e9-49e9-a6f5-133761c0d9f8/init/0.log" Feb 25 11:54:44 crc kubenswrapper[4725]: I0225 11:54:44.970741 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-hrfcv_f0789964-49e9-49e9-a6f5-133761c0d9f8/init/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.019572 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-hrfcv_f0789964-49e9-49e9-a6f5-133761c0d9f8/dnsmasq-dns/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.033215 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6_5bbf0497-1315-4613-b6ff-c826f5cf2a75/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.206752 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e/glance-httpd/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.214087 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e/glance-log/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.444209 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_993eb2eb-155b-419e-85a7-c59a25492dda/glance-httpd/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.469070 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_993eb2eb-155b-419e-85a7-c59a25492dda/glance-log/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.656043 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7cbf649584-gsrdx_f017ec2d-5d1b-405c-b2f7-b3212e3696d7/horizon/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.688083 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-skp48_b848df94-cae6-4ec8-bade-58be45c1cb4e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.914570 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-7cbf649584-gsrdx_f017ec2d-5d1b-405c-b2f7-b3212e3696d7/horizon-log/0.log" Feb 25 11:54:45 crc kubenswrapper[4725]: I0225 11:54:45.982028 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gznp5_b40ab19d-a233-4263-b29f-390b5069752d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:46 crc kubenswrapper[4725]: I0225 11:54:46.194794 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c0e72df9-3fcc-4373-b1af-fac9d1bc5e99/kube-state-metrics/0.log" Feb 25 11:54:46 crc kubenswrapper[4725]: I0225 11:54:46.282308 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7dcb568bf7-chvcs_8145d393-0967-4acc-bd07-befcc3252202/keystone-api/0.log" Feb 25 11:54:46 crc kubenswrapper[4725]: I0225 11:54:46.353538 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n_6c225171-2b3a-414b-94d4-d73cc4d28b97/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:46 crc kubenswrapper[4725]: I0225 11:54:46.765526 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58868cbfd5-pvwdv_4971206d-e6f2-4355-8c47-9a7c9e1e51d6/neutron-api/0.log" Feb 25 11:54:46 crc kubenswrapper[4725]: I0225 11:54:46.795716 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58868cbfd5-pvwdv_4971206d-e6f2-4355-8c47-9a7c9e1e51d6/neutron-httpd/0.log" Feb 25 11:54:47 crc kubenswrapper[4725]: I0225 11:54:47.001123 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9_9479ee63-ae8c-4dfb-87f0-d92785a85f3b/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:47 crc kubenswrapper[4725]: I0225 11:54:47.711972 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_0e95c876-3305-4b1d-9062-dffe7e184ffd/nova-api-log/0.log" Feb 25 11:54:47 crc kubenswrapper[4725]: I0225 11:54:47.743885 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6/nova-cell0-conductor-conductor/0.log" Feb 25 11:54:47 crc kubenswrapper[4725]: I0225 11:54:47.959972 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0e95c876-3305-4b1d-9062-dffe7e184ffd/nova-api-api/0.log" Feb 25 11:54:47 crc kubenswrapper[4725]: I0225 11:54:47.973962 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1e17e12f-d899-470f-8087-b92c47f46c5b/nova-cell1-conductor-conductor/0.log" Feb 25 11:54:48 crc kubenswrapper[4725]: I0225 11:54:48.070304 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_53df7811-b191-4c54-b2c4-5faed23e2cc3/nova-cell1-novncproxy-novncproxy/0.log" Feb 25 11:54:48 crc kubenswrapper[4725]: I0225 11:54:48.263897 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-n8lt7_4c1ac37f-ee50-4446-8433-5c3f1c427205/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:48 crc kubenswrapper[4725]: I0225 11:54:48.392926 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_670a8e0c-fb4b-4311-b236-41a3f10c1ad2/nova-metadata-log/0.log" Feb 25 11:54:48 crc kubenswrapper[4725]: I0225 11:54:48.841388 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fe5c0a24-642c-4173-9b00-3d5a327f669e/nova-scheduler-scheduler/0.log" Feb 25 11:54:48 crc kubenswrapper[4725]: I0225 11:54:48.879910 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_99ef16ee-b18a-4374-9b14-0d6e08df5558/mysql-bootstrap/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 
11:54:49.084858 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_99ef16ee-b18a-4374-9b14-0d6e08df5558/mysql-bootstrap/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.145546 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_99ef16ee-b18a-4374-9b14-0d6e08df5558/galera/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.279314 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6c23a18-36cf-4d71-885d-f2b93ba16375/mysql-bootstrap/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.457880 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6c23a18-36cf-4d71-885d-f2b93ba16375/mysql-bootstrap/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.508222 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6c23a18-36cf-4d71-885d-f2b93ba16375/galera/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.694487 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_71cbcb8e-872e-48b4-93a9-f5ee2edb3746/openstackclient/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.751166 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hlw77_82a07d0a-26d5-463c-95aa-eb022c49ac9d/openstack-network-exporter/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.830335 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_670a8e0c-fb4b-4311-b236-41a3f10c1ad2/nova-metadata-metadata/0.log" Feb 25 11:54:49 crc kubenswrapper[4725]: I0225 11:54:49.928591 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovsdb-server-init/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.220715 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovsdb-server/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.226514 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovs-vswitchd/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.244232 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovsdb-server-init/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.449367 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xpvnr_d2445fb4-75ca-4ea2-b979-5757105279ab/ovn-controller/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.554244 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rgwzr_65453adf-918b-40e1-bce0-4d4cb4ab7f56/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.701911 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_254996fe-9d34-46de-8e63-d4762c639a24/openstack-network-exporter/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.735050 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_254996fe-9d34-46de-8e63-d4762c639a24/ovn-northd/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.834696 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_24e787b7-ef1d-4c61-b01a-f8119d7911c0/openstack-network-exporter/0.log" Feb 25 11:54:50 crc kubenswrapper[4725]: I0225 11:54:50.962072 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_24e787b7-ef1d-4c61-b01a-f8119d7911c0/ovsdbserver-nb/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.083873 4725 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_818d1929-2446-4ce6-80ec-6ed3fdec2b3d/ovsdbserver-sb/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.095962 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_818d1929-2446-4ce6-80ec-6ed3fdec2b3d/openstack-network-exporter/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.350570 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69c7668f4d-s7tf6_502da0ce-a7f4-4af1-87a8-f9a7bb197b39/placement-api/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.387104 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69c7668f4d-s7tf6_502da0ce-a7f4-4af1-87a8-f9a7bb197b39/placement-log/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.447063 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5bb7295b-193b-45b6-8913-8508d190e664/setup-container/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.670086 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8cd71ea0-569c-4093-931d-2e0c841bcbf4/setup-container/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.707176 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5bb7295b-193b-45b6-8913-8508d190e664/setup-container/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.707496 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5bb7295b-193b-45b6-8913-8508d190e664/rabbitmq/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.920143 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8cd71ea0-569c-4093-931d-2e0c841bcbf4/setup-container/0.log" Feb 25 11:54:51 crc kubenswrapper[4725]: I0225 11:54:51.938556 4725 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_8cd71ea0-569c-4093-931d-2e0c841bcbf4/rabbitmq/0.log" Feb 25 11:54:52 crc kubenswrapper[4725]: I0225 11:54:52.002852 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq_2b6f1103-ca9d-4e09-9816-83e1751a56ff/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:52 crc kubenswrapper[4725]: I0225 11:54:52.329760 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw_c034211a-1e4c-4636-9f07-a8c4b89bed34/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:52 crc kubenswrapper[4725]: I0225 11:54:52.358112 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-mlkj8_a8206236-adf4-4501-bbc7-6333709aa101/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:52 crc kubenswrapper[4725]: I0225 11:54:52.590578 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gx8f9_6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:52 crc kubenswrapper[4725]: I0225 11:54:52.668067 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rfwzw_f5f7958b-17b2-40ba-a17b-bc8eefa6d59d/ssh-known-hosts-edpm-deployment/0.log" Feb 25 11:54:52 crc kubenswrapper[4725]: I0225 11:54:52.892335 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d86f859c9-f94qp_cdb91fb4-91c1-4761-8724-24a845ee9d03/proxy-server/0.log" Feb 25 11:54:52 crc kubenswrapper[4725]: I0225 11:54:52.936261 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d86f859c9-f94qp_cdb91fb4-91c1-4761-8724-24a845ee9d03/proxy-httpd/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.010059 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-zc6sk_c5574881-8546-456a-96b2-d58158e8a447/swift-ring-rebalance/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.183646 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-auditor/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.230549 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-reaper/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.257877 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-replicator/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.373971 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-server/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.468287 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-auditor/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.469902 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-server/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.484571 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-replicator/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.602046 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-updater/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.711071 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-replicator/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.714524 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-expirer/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.811725 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-auditor/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.818677 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-server/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.901071 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/rsync/0.log" Feb 25 11:54:53 crc kubenswrapper[4725]: I0225 11:54:53.928950 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-updater/0.log" Feb 25 11:54:54 crc kubenswrapper[4725]: I0225 11:54:54.103627 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/swift-recon-cron/0.log" Feb 25 11:54:54 crc kubenswrapper[4725]: I0225 11:54:54.190961 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dg75m_07f2c78c-f46d-4751-ae1b-ac502a378ff4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:54:54 crc kubenswrapper[4725]: I0225 11:54:54.357596 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_07081f50-997d-4877-be58-a446955dfe62/tempest-tests-tempest-tests-runner/0.log" Feb 25 11:54:54 crc kubenswrapper[4725]: I0225 11:54:54.375968 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6d6cd6ff-f8a8-4cab-b786-90440d19dbf1/test-operator-logs-container/0.log" Feb 25 11:54:54 crc kubenswrapper[4725]: I0225 11:54:54.571872 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-p46vh_6d50a11f-90c8-490f-90a3-9fb2c14f2bea/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 11:55:02 crc kubenswrapper[4725]: I0225 11:55:02.191502 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a30e3088-499a-491e-a9b0-65e54ac709c9/memcached/0.log" Feb 25 11:55:11 crc kubenswrapper[4725]: I0225 11:55:11.555119 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 11:55:11 crc kubenswrapper[4725]: I0225 11:55:11.555678 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 11:55:11 crc kubenswrapper[4725]: I0225 11:55:11.555722 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 11:55:11 crc kubenswrapper[4725]: I0225 11:55:11.556434 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Feb 25 11:55:11 crc kubenswrapper[4725]: I0225 11:55:11.556486 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" gracePeriod=600 Feb 25 11:55:11 crc kubenswrapper[4725]: E0225 11:55:11.684751 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:55:12 crc kubenswrapper[4725]: I0225 11:55:12.586562 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" exitCode=0 Feb 25 11:55:12 crc kubenswrapper[4725]: I0225 11:55:12.586613 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c"} Feb 25 11:55:12 crc kubenswrapper[4725]: I0225 11:55:12.586653 4725 scope.go:117] "RemoveContainer" containerID="55972f279b171bec5e6d0dee8be26569a49cd30b83e5c71721b156cab7b1e025" Feb 25 11:55:12 crc kubenswrapper[4725]: I0225 11:55:12.587163 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:55:12 crc kubenswrapper[4725]: E0225 11:55:12.587440 4725 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:55:20 crc kubenswrapper[4725]: I0225 11:55:20.355884 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-gm94c_a897851d-6b6d-40e1-82f2-ef4db97b19d9/manager/0.log" Feb 25 11:55:20 crc kubenswrapper[4725]: I0225 11:55:20.575319 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/util/0.log" Feb 25 11:55:20 crc kubenswrapper[4725]: I0225 11:55:20.816661 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/pull/0.log" Feb 25 11:55:20 crc kubenswrapper[4725]: I0225 11:55:20.831785 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/util/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.100244 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/pull/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.220711 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/util/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.276886 4725 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-65rfv_27540507-aac9-4fd2-84a9-34a2a20885d7/manager/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.314348 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/pull/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.456943 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/extract/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.720174 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-97g26_22854bfa-3684-4750-b2f7-e5ccbe3e92fb/manager/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.789028 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-wj5dw_0755d178-0ceb-41f1-a26c-e96e466f8300/manager/0.log" Feb 25 11:55:21 crc kubenswrapper[4725]: I0225 11:55:21.959368 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-j4hbq_6cf86133-a9ef-4a8b-a957-ef8e588b200e/manager/0.log" Feb 25 11:55:22 crc kubenswrapper[4725]: I0225 11:55:22.237149 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-h2tmg_015fdc09-2359-48f1-9800-9d44efc254fc/manager/0.log" Feb 25 11:55:22 crc kubenswrapper[4725]: I0225 11:55:22.552370 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-6872z_b82c26d2-a08f-4c57-a876-9ac8a87c1fcf/manager/0.log" Feb 25 11:55:22 crc kubenswrapper[4725]: 
I0225 11:55:22.573031 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-2vhq7_9a7b2bf7-fab5-4634-9dfa-147dc2de21bc/manager/0.log" Feb 25 11:55:22 crc kubenswrapper[4725]: I0225 11:55:22.766306 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-25sql_0279e1a1-c275-48e8-815c-0afae718b93a/manager/0.log" Feb 25 11:55:22 crc kubenswrapper[4725]: I0225 11:55:22.969669 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-6s7s5_5b458e63-ce2e-4d37-9509-5b31170d932f/manager/0.log" Feb 25 11:55:23 crc kubenswrapper[4725]: I0225 11:55:23.083727 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-pxnr7_c07a7a9d-d976-4d10-af1d-b92b5da76d71/manager/0.log" Feb 25 11:55:23 crc kubenswrapper[4725]: I0225 11:55:23.415591 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-kn6fp_ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6/manager/0.log" Feb 25 11:55:23 crc kubenswrapper[4725]: I0225 11:55:23.452132 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-8fthg_37d48839-36c8-4a2c-ac3d-a4e5394b11eb/manager/0.log" Feb 25 11:55:23 crc kubenswrapper[4725]: I0225 11:55:23.675203 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd_2fbb069d-66ce-4d87-9fcb-f82181bd85e9/manager/0.log" Feb 25 11:55:24 crc kubenswrapper[4725]: I0225 11:55:24.224862 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:55:24 crc kubenswrapper[4725]: E0225 11:55:24.225404 4725 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:55:24 crc kubenswrapper[4725]: I0225 11:55:24.620741 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-74c9788cdf-zqhdj_267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe/operator/0.log" Feb 25 11:55:24 crc kubenswrapper[4725]: I0225 11:55:24.851135 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sn5wb_52de4181-d70f-4961-abfe-957862ec7ed0/registry-server/0.log" Feb 25 11:55:25 crc kubenswrapper[4725]: I0225 11:55:25.161948 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-lgqlc_07870810-90ed-47a5-90f5-b684700f7092/manager/0.log" Feb 25 11:55:25 crc kubenswrapper[4725]: I0225 11:55:25.167224 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-v8c26_4b18c8c4-1868-4383-b2d7-d9b3c9a33e03/manager/0.log" Feb 25 11:55:25 crc kubenswrapper[4725]: I0225 11:55:25.437251 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-bzx24_9921b017-bf1b-457d-b9ec-b344b0fabd1c/operator/0.log" Feb 25 11:55:25 crc kubenswrapper[4725]: I0225 11:55:25.530499 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-mvqqg_01823ef1-1bcc-49f8-8cbc-37db7edc9fd0/manager/0.log" Feb 25 11:55:25 crc kubenswrapper[4725]: I0225 11:55:25.733292 4725 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-t2ncn_cf5974e9-29dc-4274-8f65-9cf82450bdfc/manager/0.log" Feb 25 11:55:25 crc kubenswrapper[4725]: I0225 11:55:25.856110 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-8gchs_2b257035-93ff-456f-8aaa-e370a1756b0e/manager/0.log" Feb 25 11:55:26 crc kubenswrapper[4725]: I0225 11:55:26.042706 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-6lfbp_e1b06e72-2952-4eee-9732-af05abc6a117/manager/0.log" Feb 25 11:55:26 crc kubenswrapper[4725]: I0225 11:55:26.420694 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7489bcf59c-kb5pq_b6b802f9-7adb-43ca-b8ae-de7bacb908fb/manager/0.log" Feb 25 11:55:26 crc kubenswrapper[4725]: I0225 11:55:26.910175 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-l278b_41775582-fd78-4c34-93fc-60b9cdc55a2c/manager/0.log" Feb 25 11:55:36 crc kubenswrapper[4725]: I0225 11:55:36.224271 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:55:36 crc kubenswrapper[4725]: E0225 11:55:36.225177 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:55:44 crc kubenswrapper[4725]: I0225 11:55:44.501711 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gbzbf_93efef4f-c6c1-47b8-ba83-12c56c3b08ea/control-plane-machine-set-operator/0.log" Feb 25 11:55:44 crc kubenswrapper[4725]: I0225 11:55:44.603231 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mw7b2_58ea6113-66d2-421d-b7cd-723463055f04/kube-rbac-proxy/0.log" Feb 25 11:55:44 crc kubenswrapper[4725]: I0225 11:55:44.661786 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mw7b2_58ea6113-66d2-421d-b7cd-723463055f04/machine-api-operator/0.log" Feb 25 11:55:50 crc kubenswrapper[4725]: I0225 11:55:50.224526 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:55:50 crc kubenswrapper[4725]: E0225 11:55:50.225233 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:55:56 crc kubenswrapper[4725]: I0225 11:55:56.102115 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-k9jtf_32a638a3-425e-4564-b5e1-b11c3d332ed6/cert-manager-controller/0.log" Feb 25 11:55:56 crc kubenswrapper[4725]: I0225 11:55:56.252325 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6xsxb_5f1f7118-2524-4653-9a60-82142d16ef44/cert-manager-cainjector/0.log" Feb 25 11:55:56 crc kubenswrapper[4725]: I0225 11:55:56.341230 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-6wnjw_899495ee-adcf-4350-a1b3-6a3cdd8c9d42/cert-manager-webhook/0.log" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.143486 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533676-9qm58"] Feb 25 11:56:00 crc kubenswrapper[4725]: E0225 11:56:00.144494 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38102000-b649-4560-8bfb-1d734ebffa09" containerName="container-00" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.144510 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="38102000-b649-4560-8bfb-1d734ebffa09" containerName="container-00" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.144710 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="38102000-b649-4560-8bfb-1d734ebffa09" containerName="container-00" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.145451 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533676-9qm58" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.147879 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.153596 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.153848 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.156706 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533676-9qm58"] Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.256889 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vdzq\" (UniqueName: \"kubernetes.io/projected/720b297c-f3c3-4eac-8997-56451b4a2427-kube-api-access-5vdzq\") pod \"auto-csr-approver-29533676-9qm58\" (UID: \"720b297c-f3c3-4eac-8997-56451b4a2427\") " pod="openshift-infra/auto-csr-approver-29533676-9qm58" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.359877 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vdzq\" (UniqueName: \"kubernetes.io/projected/720b297c-f3c3-4eac-8997-56451b4a2427-kube-api-access-5vdzq\") pod \"auto-csr-approver-29533676-9qm58\" (UID: \"720b297c-f3c3-4eac-8997-56451b4a2427\") " pod="openshift-infra/auto-csr-approver-29533676-9qm58" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.384131 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vdzq\" (UniqueName: \"kubernetes.io/projected/720b297c-f3c3-4eac-8997-56451b4a2427-kube-api-access-5vdzq\") pod \"auto-csr-approver-29533676-9qm58\" (UID: \"720b297c-f3c3-4eac-8997-56451b4a2427\") " 
pod="openshift-infra/auto-csr-approver-29533676-9qm58" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.476698 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533676-9qm58" Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.907551 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533676-9qm58"] Feb 25 11:56:00 crc kubenswrapper[4725]: I0225 11:56:00.910152 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 11:56:01 crc kubenswrapper[4725]: I0225 11:56:01.038940 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533676-9qm58" event={"ID":"720b297c-f3c3-4eac-8997-56451b4a2427","Type":"ContainerStarted","Data":"15201c006c973add479db6d689db0cfc0ccbca34f50d5010a7ac75e8783f8c0c"} Feb 25 11:56:01 crc kubenswrapper[4725]: I0225 11:56:01.224820 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:56:01 crc kubenswrapper[4725]: E0225 11:56:01.225183 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:56:03 crc kubenswrapper[4725]: I0225 11:56:03.058066 4725 generic.go:334] "Generic (PLEG): container finished" podID="720b297c-f3c3-4eac-8997-56451b4a2427" containerID="f21e36498f6fad3fc49c5e0bdbd75ed28aab7fb6b98bfae792e34203bab2f537" exitCode=0 Feb 25 11:56:03 crc kubenswrapper[4725]: I0225 11:56:03.058142 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29533676-9qm58" event={"ID":"720b297c-f3c3-4eac-8997-56451b4a2427","Type":"ContainerDied","Data":"f21e36498f6fad3fc49c5e0bdbd75ed28aab7fb6b98bfae792e34203bab2f537"} Feb 25 11:56:04 crc kubenswrapper[4725]: I0225 11:56:04.383763 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533676-9qm58" Feb 25 11:56:04 crc kubenswrapper[4725]: I0225 11:56:04.546955 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vdzq\" (UniqueName: \"kubernetes.io/projected/720b297c-f3c3-4eac-8997-56451b4a2427-kube-api-access-5vdzq\") pod \"720b297c-f3c3-4eac-8997-56451b4a2427\" (UID: \"720b297c-f3c3-4eac-8997-56451b4a2427\") " Feb 25 11:56:04 crc kubenswrapper[4725]: I0225 11:56:04.557083 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720b297c-f3c3-4eac-8997-56451b4a2427-kube-api-access-5vdzq" (OuterVolumeSpecName: "kube-api-access-5vdzq") pod "720b297c-f3c3-4eac-8997-56451b4a2427" (UID: "720b297c-f3c3-4eac-8997-56451b4a2427"). InnerVolumeSpecName "kube-api-access-5vdzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:56:04 crc kubenswrapper[4725]: I0225 11:56:04.649621 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vdzq\" (UniqueName: \"kubernetes.io/projected/720b297c-f3c3-4eac-8997-56451b4a2427-kube-api-access-5vdzq\") on node \"crc\" DevicePath \"\"" Feb 25 11:56:05 crc kubenswrapper[4725]: I0225 11:56:05.082011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533676-9qm58" event={"ID":"720b297c-f3c3-4eac-8997-56451b4a2427","Type":"ContainerDied","Data":"15201c006c973add479db6d689db0cfc0ccbca34f50d5010a7ac75e8783f8c0c"} Feb 25 11:56:05 crc kubenswrapper[4725]: I0225 11:56:05.082306 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15201c006c973add479db6d689db0cfc0ccbca34f50d5010a7ac75e8783f8c0c" Feb 25 11:56:05 crc kubenswrapper[4725]: I0225 11:56:05.082081 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533676-9qm58" Feb 25 11:56:05 crc kubenswrapper[4725]: I0225 11:56:05.450985 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533670-klrw7"] Feb 25 11:56:05 crc kubenswrapper[4725]: I0225 11:56:05.459318 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533670-klrw7"] Feb 25 11:56:07 crc kubenswrapper[4725]: I0225 11:56:07.234510 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d6f332-ef19-43b5-8fb7-566663952b6c" path="/var/lib/kubelet/pods/c4d6f332-ef19-43b5-8fb7-566663952b6c/volumes" Feb 25 11:56:07 crc kubenswrapper[4725]: I0225 11:56:07.848035 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-mk4rx_0b1364b5-8725-454d-962e-a8c86ca27c2b/nmstate-console-plugin/0.log" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.034148 4725 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-4xp96_82055e0c-d941-42a4-a029-e58f3893b303/nmstate-handler/0.log" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.077532 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-w2z2q_e0523051-56ca-4df3-ae89-488db2c9c37a/kube-rbac-proxy/0.log" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.170153 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-w2z2q_e0523051-56ca-4df3-ae89-488db2c9c37a/nmstate-metrics/0.log" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.247449 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rrwts_7d7a9448-ae03-426a-8c08-5823c6097b8c/nmstate-operator/0.log" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.355095 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-sgrvv_93f80fd4-e221-4c81-ab48-77beb578add9/nmstate-webhook/0.log" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.751149 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zmfmw"] Feb 25 11:56:08 crc kubenswrapper[4725]: E0225 11:56:08.751620 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720b297c-f3c3-4eac-8997-56451b4a2427" containerName="oc" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.751644 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="720b297c-f3c3-4eac-8997-56451b4a2427" containerName="oc" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.751857 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="720b297c-f3c3-4eac-8997-56451b4a2427" containerName="oc" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.753212 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.765648 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmfmw"] Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.832334 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-utilities\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.832655 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-catalog-content\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.832747 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxcr8\" (UniqueName: \"kubernetes.io/projected/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-kube-api-access-nxcr8\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.934503 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-catalog-content\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.934574 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nxcr8\" (UniqueName: \"kubernetes.io/projected/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-kube-api-access-nxcr8\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.934667 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-utilities\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.935022 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-catalog-content\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.935145 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-utilities\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:08 crc kubenswrapper[4725]: I0225 11:56:08.957263 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxcr8\" (UniqueName: \"kubernetes.io/projected/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-kube-api-access-nxcr8\") pod \"certified-operators-zmfmw\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:09 crc kubenswrapper[4725]: I0225 11:56:09.072936 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:09 crc kubenswrapper[4725]: I0225 11:56:09.616800 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmfmw"] Feb 25 11:56:10 crc kubenswrapper[4725]: I0225 11:56:10.144522 4725 generic.go:334] "Generic (PLEG): container finished" podID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerID="29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd" exitCode=0 Feb 25 11:56:10 crc kubenswrapper[4725]: I0225 11:56:10.144559 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmfmw" event={"ID":"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5","Type":"ContainerDied","Data":"29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd"} Feb 25 11:56:10 crc kubenswrapper[4725]: I0225 11:56:10.144583 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmfmw" event={"ID":"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5","Type":"ContainerStarted","Data":"b5a4c7e12bcc389a08343754644211db479e404ef55081f998b1cf16d48b07e8"} Feb 25 11:56:11 crc kubenswrapper[4725]: I0225 11:56:11.155159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmfmw" event={"ID":"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5","Type":"ContainerStarted","Data":"e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1"} Feb 25 11:56:12 crc kubenswrapper[4725]: I0225 11:56:12.167723 4725 generic.go:334] "Generic (PLEG): container finished" podID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerID="e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1" exitCode=0 Feb 25 11:56:12 crc kubenswrapper[4725]: I0225 11:56:12.167879 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmfmw" 
event={"ID":"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5","Type":"ContainerDied","Data":"e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1"} Feb 25 11:56:15 crc kubenswrapper[4725]: I0225 11:56:15.196410 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmfmw" event={"ID":"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5","Type":"ContainerStarted","Data":"98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a"} Feb 25 11:56:15 crc kubenswrapper[4725]: I0225 11:56:15.217529 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zmfmw" podStartSLOduration=3.314302012 podStartE2EDuration="7.217506569s" podCreationTimestamp="2026-02-25 11:56:08 +0000 UTC" firstStartedPulling="2026-02-25 11:56:10.146764625 +0000 UTC m=+3795.645346650" lastFinishedPulling="2026-02-25 11:56:14.049969172 +0000 UTC m=+3799.548551207" observedRunningTime="2026-02-25 11:56:15.214035277 +0000 UTC m=+3800.712617322" watchObservedRunningTime="2026-02-25 11:56:15.217506569 +0000 UTC m=+3800.716088594" Feb 25 11:56:15 crc kubenswrapper[4725]: I0225 11:56:15.235822 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:56:15 crc kubenswrapper[4725]: E0225 11:56:15.236192 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:56:19 crc kubenswrapper[4725]: I0225 11:56:19.073515 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:19 crc 
kubenswrapper[4725]: I0225 11:56:19.074020 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:19 crc kubenswrapper[4725]: I0225 11:56:19.125021 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:19 crc kubenswrapper[4725]: I0225 11:56:19.286481 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:19 crc kubenswrapper[4725]: I0225 11:56:19.360087 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmfmw"] Feb 25 11:56:21 crc kubenswrapper[4725]: I0225 11:56:21.259566 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zmfmw" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="registry-server" containerID="cri-o://98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a" gracePeriod=2 Feb 25 11:56:21 crc kubenswrapper[4725]: I0225 11:56:21.787731 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:21 crc kubenswrapper[4725]: I0225 11:56:21.974422 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-utilities\") pod \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " Feb 25 11:56:21 crc kubenswrapper[4725]: I0225 11:56:21.974467 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-catalog-content\") pod \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " Feb 25 11:56:21 crc kubenswrapper[4725]: I0225 11:56:21.974509 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxcr8\" (UniqueName: \"kubernetes.io/projected/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-kube-api-access-nxcr8\") pod \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\" (UID: \"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5\") " Feb 25 11:56:21 crc kubenswrapper[4725]: I0225 11:56:21.975429 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-utilities" (OuterVolumeSpecName: "utilities") pod "9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" (UID: "9d9c26a7-3282-44fe-a8ff-719d7f30d8b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:56:21 crc kubenswrapper[4725]: I0225 11:56:21.979899 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-kube-api-access-nxcr8" (OuterVolumeSpecName: "kube-api-access-nxcr8") pod "9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" (UID: "9d9c26a7-3282-44fe-a8ff-719d7f30d8b5"). InnerVolumeSpecName "kube-api-access-nxcr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.029166 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" (UID: "9d9c26a7-3282-44fe-a8ff-719d7f30d8b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.076337 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.076377 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.076390 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxcr8\" (UniqueName: \"kubernetes.io/projected/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5-kube-api-access-nxcr8\") on node \"crc\" DevicePath \"\"" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.237716 4725 scope.go:117] "RemoveContainer" containerID="fde251ff53c95f8cda1be1c9280fad856e3207095c23855ce8674dc5c2b711ba" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.268477 4725 generic.go:334] "Generic (PLEG): container finished" podID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerID="98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a" exitCode=0 Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.268527 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmfmw" 
event={"ID":"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5","Type":"ContainerDied","Data":"98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a"} Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.268568 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmfmw" event={"ID":"9d9c26a7-3282-44fe-a8ff-719d7f30d8b5","Type":"ContainerDied","Data":"b5a4c7e12bcc389a08343754644211db479e404ef55081f998b1cf16d48b07e8"} Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.268589 4725 scope.go:117] "RemoveContainer" containerID="98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.268732 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmfmw" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.317257 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmfmw"] Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.322758 4725 scope.go:117] "RemoveContainer" containerID="e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.325710 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zmfmw"] Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.346201 4725 scope.go:117] "RemoveContainer" containerID="29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.361685 4725 scope.go:117] "RemoveContainer" containerID="98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a" Feb 25 11:56:22 crc kubenswrapper[4725]: E0225 11:56:22.362364 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a\": container 
with ID starting with 98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a not found: ID does not exist" containerID="98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.362408 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a"} err="failed to get container status \"98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a\": rpc error: code = NotFound desc = could not find container \"98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a\": container with ID starting with 98fa3a8befa4052763cb97cb2a072bce3f98a5299eb69348111f1d1ac8a2882a not found: ID does not exist" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.362435 4725 scope.go:117] "RemoveContainer" containerID="e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1" Feb 25 11:56:22 crc kubenswrapper[4725]: E0225 11:56:22.362885 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1\": container with ID starting with e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1 not found: ID does not exist" containerID="e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.362934 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1"} err="failed to get container status \"e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1\": rpc error: code = NotFound desc = could not find container \"e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1\": container with ID starting with e77af8841b1b737671836b57223287f2305c36304b791c1a1515f4e3493517e1 not 
found: ID does not exist" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.362961 4725 scope.go:117] "RemoveContainer" containerID="29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd" Feb 25 11:56:22 crc kubenswrapper[4725]: E0225 11:56:22.363229 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd\": container with ID starting with 29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd not found: ID does not exist" containerID="29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd" Feb 25 11:56:22 crc kubenswrapper[4725]: I0225 11:56:22.363262 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd"} err="failed to get container status \"29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd\": rpc error: code = NotFound desc = could not find container \"29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd\": container with ID starting with 29757b0ca70537b2b358e9279c078c4044ca138256eb819f5659e67729d1b0cd not found: ID does not exist" Feb 25 11:56:23 crc kubenswrapper[4725]: I0225 11:56:23.249622 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" path="/var/lib/kubelet/pods/9d9c26a7-3282-44fe-a8ff-719d7f30d8b5/volumes" Feb 25 11:56:30 crc kubenswrapper[4725]: I0225 11:56:30.225166 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:56:30 crc kubenswrapper[4725]: E0225 11:56:30.226303 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:56:34 crc kubenswrapper[4725]: I0225 11:56:34.275404 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-67xsd_beb65949-bc67-4f16-892c-8979cc412e9e/kube-rbac-proxy/0.log" Feb 25 11:56:34 crc kubenswrapper[4725]: I0225 11:56:34.404563 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-67xsd_beb65949-bc67-4f16-892c-8979cc412e9e/controller/0.log" Feb 25 11:56:34 crc kubenswrapper[4725]: I0225 11:56:34.514634 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 11:56:34 crc kubenswrapper[4725]: I0225 11:56:34.826795 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 11:56:34 crc kubenswrapper[4725]: I0225 11:56:34.844351 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 11:56:34 crc kubenswrapper[4725]: I0225 11:56:34.870331 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 11:56:34 crc kubenswrapper[4725]: I0225 11:56:34.878724 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.094296 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 
11:56:35.151940 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.161421 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.177773 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.334386 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.343340 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.384483 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.439892 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/controller/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.592193 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/frr-metrics/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.691962 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/kube-rbac-proxy/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.697748 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/kube-rbac-proxy-frr/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.878734 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/reloader/0.log" Feb 25 11:56:35 crc kubenswrapper[4725]: I0225 11:56:35.981519 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-xn4fl_bcc5161a-6f59-4878-a2ae-5f4a533021c3/frr-k8s-webhook-server/0.log" Feb 25 11:56:36 crc kubenswrapper[4725]: I0225 11:56:36.215785 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-768ffd8bd5-q5ktr_14923832-70ad-4019-b795-4094d767dfda/manager/0.log" Feb 25 11:56:36 crc kubenswrapper[4725]: I0225 11:56:36.385718 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56448fcbcf-jpqnm_50bf643b-abcd-4134-bfd5-a08256ad5652/webhook-server/0.log" Feb 25 11:56:36 crc kubenswrapper[4725]: I0225 11:56:36.483695 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-svwnh_b577777e-718a-4f09-a76a-98aa4f068184/kube-rbac-proxy/0.log" Feb 25 11:56:36 crc kubenswrapper[4725]: I0225 11:56:36.979083 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/frr/0.log" Feb 25 11:56:37 crc kubenswrapper[4725]: I0225 11:56:37.519249 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-svwnh_b577777e-718a-4f09-a76a-98aa4f068184/speaker/0.log" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.164139 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8c45k"] Feb 25 11:56:40 crc kubenswrapper[4725]: E0225 11:56:40.164921 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="extract-utilities" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.164939 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="extract-utilities" Feb 25 11:56:40 crc kubenswrapper[4725]: E0225 11:56:40.164961 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="extract-content" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.164969 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="extract-content" Feb 25 11:56:40 crc kubenswrapper[4725]: E0225 11:56:40.164990 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="registry-server" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.164999 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="registry-server" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.165237 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9c26a7-3282-44fe-a8ff-719d7f30d8b5" containerName="registry-server" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.166892 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.175865 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8c45k"] Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.213839 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-catalog-content\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.213887 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-utilities\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.213946 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5qln\" (UniqueName: \"kubernetes.io/projected/b88c4b73-e22a-4e07-aef6-574b2605a282-kube-api-access-q5qln\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.315433 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-catalog-content\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.315474 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-utilities\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.315524 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5qln\" (UniqueName: \"kubernetes.io/projected/b88c4b73-e22a-4e07-aef6-574b2605a282-kube-api-access-q5qln\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.316965 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-catalog-content\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.317188 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-utilities\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.335686 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5qln\" (UniqueName: \"kubernetes.io/projected/b88c4b73-e22a-4e07-aef6-574b2605a282-kube-api-access-q5qln\") pod \"redhat-operators-8c45k\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.489886 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:40 crc kubenswrapper[4725]: I0225 11:56:40.953673 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8c45k"] Feb 25 11:56:41 crc kubenswrapper[4725]: I0225 11:56:41.443429 4725 generic.go:334] "Generic (PLEG): container finished" podID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerID="2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f" exitCode=0 Feb 25 11:56:41 crc kubenswrapper[4725]: I0225 11:56:41.443576 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c45k" event={"ID":"b88c4b73-e22a-4e07-aef6-574b2605a282","Type":"ContainerDied","Data":"2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f"} Feb 25 11:56:41 crc kubenswrapper[4725]: I0225 11:56:41.444684 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c45k" event={"ID":"b88c4b73-e22a-4e07-aef6-574b2605a282","Type":"ContainerStarted","Data":"0ba0bc50cddfead991f4b09ee38ada4005a68cd1239e9031290cee10317d8779"} Feb 25 11:56:43 crc kubenswrapper[4725]: I0225 11:56:43.465630 4725 generic.go:334] "Generic (PLEG): container finished" podID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerID="8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96" exitCode=0 Feb 25 11:56:43 crc kubenswrapper[4725]: I0225 11:56:43.466549 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c45k" event={"ID":"b88c4b73-e22a-4e07-aef6-574b2605a282","Type":"ContainerDied","Data":"8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96"} Feb 25 11:56:44 crc kubenswrapper[4725]: I0225 11:56:44.224195 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:56:44 crc kubenswrapper[4725]: E0225 11:56:44.225052 4725 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:56:44 crc kubenswrapper[4725]: I0225 11:56:44.476422 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c45k" event={"ID":"b88c4b73-e22a-4e07-aef6-574b2605a282","Type":"ContainerStarted","Data":"61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03"} Feb 25 11:56:44 crc kubenswrapper[4725]: I0225 11:56:44.501848 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8c45k" podStartSLOduration=2.063362495 podStartE2EDuration="4.501803385s" podCreationTimestamp="2026-02-25 11:56:40 +0000 UTC" firstStartedPulling="2026-02-25 11:56:41.445379863 +0000 UTC m=+3826.943961888" lastFinishedPulling="2026-02-25 11:56:43.883820753 +0000 UTC m=+3829.382402778" observedRunningTime="2026-02-25 11:56:44.494006908 +0000 UTC m=+3829.992588963" watchObservedRunningTime="2026-02-25 11:56:44.501803385 +0000 UTC m=+3830.000385410" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.490872 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.491340 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.547535 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.597513 4725 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.739877 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/util/0.log" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.785965 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8c45k"] Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.897611 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/util/0.log" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.927203 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/pull/0.log" Feb 25 11:56:50 crc kubenswrapper[4725]: I0225 11:56:50.935764 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/pull/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.143362 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/extract/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.176898 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/util/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.242268 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/pull/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.324400 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-utilities/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.473282 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-utilities/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.498605 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-content/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.503684 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-content/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.653974 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-utilities/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.728939 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-content/0.log" Feb 25 11:56:51 crc kubenswrapper[4725]: I0225 11:56:51.868683 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-utilities/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.160056 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-content/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.161514 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-utilities/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.165953 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-content/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.343464 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-content/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.354959 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/registry-server/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.381886 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-utilities/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.543209 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8c45k" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="registry-server" containerID="cri-o://61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03" gracePeriod=2 Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.620552 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/util/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.732405 
4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/util/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.760527 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/pull/0.log" Feb 25 11:56:52 crc kubenswrapper[4725]: I0225 11:56:52.835327 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/pull/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.047054 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/extract/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.152230 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/util/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.192683 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/pull/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.431001 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.495945 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k82sj_d18563e5-7e1f-4e98-9419-d71fa34b9fd2/marketplace-operator/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.555062 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-utilities\") pod \"b88c4b73-e22a-4e07-aef6-574b2605a282\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.555128 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5qln\" (UniqueName: \"kubernetes.io/projected/b88c4b73-e22a-4e07-aef6-574b2605a282-kube-api-access-q5qln\") pod \"b88c4b73-e22a-4e07-aef6-574b2605a282\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.555157 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-catalog-content\") pod \"b88c4b73-e22a-4e07-aef6-574b2605a282\" (UID: \"b88c4b73-e22a-4e07-aef6-574b2605a282\") " Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.557252 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-utilities" (OuterVolumeSpecName: "utilities") pod "b88c4b73-e22a-4e07-aef6-574b2605a282" (UID: "b88c4b73-e22a-4e07-aef6-574b2605a282"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.562489 4725 generic.go:334] "Generic (PLEG): container finished" podID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerID="61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03" exitCode=0 Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.562525 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c45k" event={"ID":"b88c4b73-e22a-4e07-aef6-574b2605a282","Type":"ContainerDied","Data":"61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03"} Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.562551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c45k" event={"ID":"b88c4b73-e22a-4e07-aef6-574b2605a282","Type":"ContainerDied","Data":"0ba0bc50cddfead991f4b09ee38ada4005a68cd1239e9031290cee10317d8779"} Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.562568 4725 scope.go:117] "RemoveContainer" containerID="61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.562718 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8c45k" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.563638 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88c4b73-e22a-4e07-aef6-574b2605a282-kube-api-access-q5qln" (OuterVolumeSpecName: "kube-api-access-q5qln") pod "b88c4b73-e22a-4e07-aef6-574b2605a282" (UID: "b88c4b73-e22a-4e07-aef6-574b2605a282"). InnerVolumeSpecName "kube-api-access-q5qln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.600064 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/registry-server/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.632870 4725 scope.go:117] "RemoveContainer" containerID="8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.651390 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-utilities/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.658405 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5qln\" (UniqueName: \"kubernetes.io/projected/b88c4b73-e22a-4e07-aef6-574b2605a282-kube-api-access-q5qln\") on node \"crc\" DevicePath \"\"" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.658438 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.673186 4725 scope.go:117] "RemoveContainer" containerID="2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.710220 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b88c4b73-e22a-4e07-aef6-574b2605a282" (UID: "b88c4b73-e22a-4e07-aef6-574b2605a282"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.715343 4725 scope.go:117] "RemoveContainer" containerID="61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03" Feb 25 11:56:53 crc kubenswrapper[4725]: E0225 11:56:53.715910 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03\": container with ID starting with 61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03 not found: ID does not exist" containerID="61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.715961 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03"} err="failed to get container status \"61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03\": rpc error: code = NotFound desc = could not find container \"61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03\": container with ID starting with 61446c2a63fc450e79c88b94a466fa322977cf8188d5420e778ceccf2680cd03 not found: ID does not exist" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.715988 4725 scope.go:117] "RemoveContainer" containerID="8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96" Feb 25 11:56:53 crc kubenswrapper[4725]: E0225 11:56:53.716434 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96\": container with ID starting with 8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96 not found: ID does not exist" containerID="8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.716458 
4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96"} err="failed to get container status \"8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96\": rpc error: code = NotFound desc = could not find container \"8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96\": container with ID starting with 8b45db38b8c0a7548e68941b4cc12c808e5106f2a60bc016900d548a00e37c96 not found: ID does not exist" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.716496 4725 scope.go:117] "RemoveContainer" containerID="2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f" Feb 25 11:56:53 crc kubenswrapper[4725]: E0225 11:56:53.717672 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f\": container with ID starting with 2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f not found: ID does not exist" containerID="2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.717862 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f"} err="failed to get container status \"2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f\": rpc error: code = NotFound desc = could not find container \"2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f\": container with ID starting with 2b8d38fa8b75ac4ae1333ec972746a844acaf80bcb39f3099814fbbbb4b1471f not found: ID does not exist" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.760277 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b88c4b73-e22a-4e07-aef6-574b2605a282-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.820861 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-content/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.828339 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-utilities/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.864061 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-content/0.log" Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.933525 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8c45k"] Feb 25 11:56:53 crc kubenswrapper[4725]: I0225 11:56:53.945543 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8c45k"] Feb 25 11:56:54 crc kubenswrapper[4725]: I0225 11:56:54.112663 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-utilities/0.log" Feb 25 11:56:54 crc kubenswrapper[4725]: I0225 11:56:54.194584 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/registry-server/0.log" Feb 25 11:56:54 crc kubenswrapper[4725]: I0225 11:56:54.276753 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-content/0.log" Feb 25 11:56:54 crc kubenswrapper[4725]: I0225 11:56:54.979409 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-utilities/0.log" Feb 25 11:56:55 crc kubenswrapper[4725]: I0225 11:56:55.171615 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-content/0.log" Feb 25 11:56:55 crc kubenswrapper[4725]: I0225 11:56:55.184556 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-utilities/0.log" Feb 25 11:56:55 crc kubenswrapper[4725]: I0225 11:56:55.193250 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-content/0.log" Feb 25 11:56:55 crc kubenswrapper[4725]: I0225 11:56:55.237109 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" path="/var/lib/kubelet/pods/b88c4b73-e22a-4e07-aef6-574b2605a282/volumes" Feb 25 11:56:55 crc kubenswrapper[4725]: I0225 11:56:55.348724 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-utilities/0.log" Feb 25 11:56:55 crc kubenswrapper[4725]: I0225 11:56:55.378118 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-content/0.log" Feb 25 11:56:55 crc kubenswrapper[4725]: I0225 11:56:55.879009 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/registry-server/0.log" Feb 25 11:56:59 crc kubenswrapper[4725]: I0225 11:56:59.224479 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:56:59 crc kubenswrapper[4725]: E0225 
11:56:59.225291 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:57:12 crc kubenswrapper[4725]: I0225 11:57:12.224365 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:57:12 crc kubenswrapper[4725]: E0225 11:57:12.225183 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:57:27 crc kubenswrapper[4725]: I0225 11:57:27.225077 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:57:27 crc kubenswrapper[4725]: E0225 11:57:27.225963 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:57:39 crc kubenswrapper[4725]: I0225 11:57:39.229951 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:57:39 crc 
kubenswrapper[4725]: E0225 11:57:39.231034 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:57:53 crc kubenswrapper[4725]: I0225 11:57:53.224515 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:57:53 crc kubenswrapper[4725]: E0225 11:57:53.225690 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.155884 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533678-nck22"] Feb 25 11:58:00 crc kubenswrapper[4725]: E0225 11:58:00.157780 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="extract-utilities" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.157890 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="extract-utilities" Feb 25 11:58:00 crc kubenswrapper[4725]: E0225 11:58:00.157983 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="registry-server" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.158038 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="registry-server" Feb 25 11:58:00 crc kubenswrapper[4725]: E0225 11:58:00.158113 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="extract-content" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.158166 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="extract-content" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.158402 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88c4b73-e22a-4e07-aef6-574b2605a282" containerName="registry-server" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.159168 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533678-nck22" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.161400 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.165345 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.167889 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533678-nck22"] Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.175097 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.278384 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zkgq\" (UniqueName: \"kubernetes.io/projected/6aec2150-ee04-432f-b0d3-a8a91d1eca11-kube-api-access-2zkgq\") pod \"auto-csr-approver-29533678-nck22\" (UID: 
\"6aec2150-ee04-432f-b0d3-a8a91d1eca11\") " pod="openshift-infra/auto-csr-approver-29533678-nck22" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.380555 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zkgq\" (UniqueName: \"kubernetes.io/projected/6aec2150-ee04-432f-b0d3-a8a91d1eca11-kube-api-access-2zkgq\") pod \"auto-csr-approver-29533678-nck22\" (UID: \"6aec2150-ee04-432f-b0d3-a8a91d1eca11\") " pod="openshift-infra/auto-csr-approver-29533678-nck22" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.405240 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zkgq\" (UniqueName: \"kubernetes.io/projected/6aec2150-ee04-432f-b0d3-a8a91d1eca11-kube-api-access-2zkgq\") pod \"auto-csr-approver-29533678-nck22\" (UID: \"6aec2150-ee04-432f-b0d3-a8a91d1eca11\") " pod="openshift-infra/auto-csr-approver-29533678-nck22" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.484386 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533678-nck22" Feb 25 11:58:00 crc kubenswrapper[4725]: I0225 11:58:00.980242 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533678-nck22"] Feb 25 11:58:01 crc kubenswrapper[4725]: I0225 11:58:01.166548 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533678-nck22" event={"ID":"6aec2150-ee04-432f-b0d3-a8a91d1eca11","Type":"ContainerStarted","Data":"fe981a855462cea17c94e2ab28866ff46a19699031eae9669c3136d47040b2af"} Feb 25 11:58:04 crc kubenswrapper[4725]: I0225 11:58:04.197268 4725 generic.go:334] "Generic (PLEG): container finished" podID="6aec2150-ee04-432f-b0d3-a8a91d1eca11" containerID="69ffe4928c4b6e8ae537c96d58a3ab872147a72b0aa87a6e3ec8fc52632da7c4" exitCode=0 Feb 25 11:58:04 crc kubenswrapper[4725]: I0225 11:58:04.197391 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533678-nck22" event={"ID":"6aec2150-ee04-432f-b0d3-a8a91d1eca11","Type":"ContainerDied","Data":"69ffe4928c4b6e8ae537c96d58a3ab872147a72b0aa87a6e3ec8fc52632da7c4"} Feb 25 11:58:05 crc kubenswrapper[4725]: I0225 11:58:05.662168 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533678-nck22" Feb 25 11:58:05 crc kubenswrapper[4725]: I0225 11:58:05.794526 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zkgq\" (UniqueName: \"kubernetes.io/projected/6aec2150-ee04-432f-b0d3-a8a91d1eca11-kube-api-access-2zkgq\") pod \"6aec2150-ee04-432f-b0d3-a8a91d1eca11\" (UID: \"6aec2150-ee04-432f-b0d3-a8a91d1eca11\") " Feb 25 11:58:05 crc kubenswrapper[4725]: I0225 11:58:05.806075 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aec2150-ee04-432f-b0d3-a8a91d1eca11-kube-api-access-2zkgq" (OuterVolumeSpecName: "kube-api-access-2zkgq") pod "6aec2150-ee04-432f-b0d3-a8a91d1eca11" (UID: "6aec2150-ee04-432f-b0d3-a8a91d1eca11"). InnerVolumeSpecName "kube-api-access-2zkgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:58:05 crc kubenswrapper[4725]: I0225 11:58:05.897028 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zkgq\" (UniqueName: \"kubernetes.io/projected/6aec2150-ee04-432f-b0d3-a8a91d1eca11-kube-api-access-2zkgq\") on node \"crc\" DevicePath \"\"" Feb 25 11:58:06 crc kubenswrapper[4725]: I0225 11:58:06.213441 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533678-nck22" event={"ID":"6aec2150-ee04-432f-b0d3-a8a91d1eca11","Type":"ContainerDied","Data":"fe981a855462cea17c94e2ab28866ff46a19699031eae9669c3136d47040b2af"} Feb 25 11:58:06 crc kubenswrapper[4725]: I0225 11:58:06.213479 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe981a855462cea17c94e2ab28866ff46a19699031eae9669c3136d47040b2af" Feb 25 11:58:06 crc kubenswrapper[4725]: I0225 11:58:06.213494 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533678-nck22" Feb 25 11:58:06 crc kubenswrapper[4725]: I0225 11:58:06.741279 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533672-q4ngr"] Feb 25 11:58:06 crc kubenswrapper[4725]: I0225 11:58:06.750423 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533672-q4ngr"] Feb 25 11:58:07 crc kubenswrapper[4725]: I0225 11:58:07.224667 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:58:07 crc kubenswrapper[4725]: E0225 11:58:07.224952 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:58:07 crc kubenswrapper[4725]: I0225 11:58:07.236858 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26eb5fbb-a4eb-44d4-8628-aedbcdae0bff" path="/var/lib/kubelet/pods/26eb5fbb-a4eb-44d4-8628-aedbcdae0bff/volumes" Feb 25 11:58:22 crc kubenswrapper[4725]: I0225 11:58:22.225097 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:58:22 crc kubenswrapper[4725]: E0225 11:58:22.226081 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:58:22 crc kubenswrapper[4725]: I0225 11:58:22.362556 4725 scope.go:117] "RemoveContainer" containerID="246e8dd9fcfe2546bf59e4ae870c86db0d3e194ea742e37cae5875462787707c" Feb 25 11:58:35 crc kubenswrapper[4725]: I0225 11:58:35.232855 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:58:35 crc kubenswrapper[4725]: E0225 11:58:35.233749 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:58:43 crc kubenswrapper[4725]: I0225 11:58:43.598817 4725 generic.go:334] "Generic (PLEG): container finished" podID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerID="3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f" exitCode=0 Feb 25 11:58:43 crc kubenswrapper[4725]: I0225 11:58:43.598909 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" event={"ID":"54161a52-ee5a-492c-ba0b-2d9292adb410","Type":"ContainerDied","Data":"3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f"} Feb 25 11:58:43 crc kubenswrapper[4725]: I0225 11:58:43.600492 4725 scope.go:117] "RemoveContainer" containerID="3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f" Feb 25 11:58:44 crc kubenswrapper[4725]: I0225 11:58:44.267705 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmjfg_must-gather-jp2kj_54161a52-ee5a-492c-ba0b-2d9292adb410/gather/0.log" Feb 25 11:58:49 crc kubenswrapper[4725]: I0225 11:58:49.226857 4725 scope.go:117] "RemoveContainer" 
containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:58:49 crc kubenswrapper[4725]: E0225 11:58:49.227908 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:58:52 crc kubenswrapper[4725]: I0225 11:58:52.753308 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmjfg/must-gather-jp2kj"] Feb 25 11:58:52 crc kubenswrapper[4725]: I0225 11:58:52.754207 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerName="copy" containerID="cri-o://5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca" gracePeriod=2 Feb 25 11:58:52 crc kubenswrapper[4725]: I0225 11:58:52.765010 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmjfg/must-gather-jp2kj"] Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.184085 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmjfg_must-gather-jp2kj_54161a52-ee5a-492c-ba0b-2d9292adb410/copy/0.log" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.184882 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.275909 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bqrh\" (UniqueName: \"kubernetes.io/projected/54161a52-ee5a-492c-ba0b-2d9292adb410-kube-api-access-5bqrh\") pod \"54161a52-ee5a-492c-ba0b-2d9292adb410\" (UID: \"54161a52-ee5a-492c-ba0b-2d9292adb410\") " Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.275971 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54161a52-ee5a-492c-ba0b-2d9292adb410-must-gather-output\") pod \"54161a52-ee5a-492c-ba0b-2d9292adb410\" (UID: \"54161a52-ee5a-492c-ba0b-2d9292adb410\") " Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.291154 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54161a52-ee5a-492c-ba0b-2d9292adb410-kube-api-access-5bqrh" (OuterVolumeSpecName: "kube-api-access-5bqrh") pod "54161a52-ee5a-492c-ba0b-2d9292adb410" (UID: "54161a52-ee5a-492c-ba0b-2d9292adb410"). InnerVolumeSpecName "kube-api-access-5bqrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.378315 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bqrh\" (UniqueName: \"kubernetes.io/projected/54161a52-ee5a-492c-ba0b-2d9292adb410-kube-api-access-5bqrh\") on node \"crc\" DevicePath \"\"" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.431394 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54161a52-ee5a-492c-ba0b-2d9292adb410-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "54161a52-ee5a-492c-ba0b-2d9292adb410" (UID: "54161a52-ee5a-492c-ba0b-2d9292adb410"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.479508 4725 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54161a52-ee5a-492c-ba0b-2d9292adb410-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.731596 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmjfg_must-gather-jp2kj_54161a52-ee5a-492c-ba0b-2d9292adb410/copy/0.log" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.732043 4725 generic.go:334] "Generic (PLEG): container finished" podID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerID="5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca" exitCode=143 Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.732102 4725 scope.go:117] "RemoveContainer" containerID="5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.732133 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmjfg/must-gather-jp2kj" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.750925 4725 scope.go:117] "RemoveContainer" containerID="3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.857951 4725 scope.go:117] "RemoveContainer" containerID="5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca" Feb 25 11:58:53 crc kubenswrapper[4725]: E0225 11:58:53.858685 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca\": container with ID starting with 5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca not found: ID does not exist" containerID="5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.858763 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca"} err="failed to get container status \"5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca\": rpc error: code = NotFound desc = could not find container \"5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca\": container with ID starting with 5b517025eb16c393467ecb1df4b3855e27d15873e54d6da751399a72efc080ca not found: ID does not exist" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.858853 4725 scope.go:117] "RemoveContainer" containerID="3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f" Feb 25 11:58:53 crc kubenswrapper[4725]: E0225 11:58:53.859307 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f\": container with ID starting with 
3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f not found: ID does not exist" containerID="3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f" Feb 25 11:58:53 crc kubenswrapper[4725]: I0225 11:58:53.859356 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f"} err="failed to get container status \"3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f\": rpc error: code = NotFound desc = could not find container \"3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f\": container with ID starting with 3671998171d85393348774e7e6d6e823a50ece18ec20dcdc948365959980683f not found: ID does not exist" Feb 25 11:58:55 crc kubenswrapper[4725]: I0225 11:58:55.239462 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" path="/var/lib/kubelet/pods/54161a52-ee5a-492c-ba0b-2d9292adb410/volumes" Feb 25 11:59:04 crc kubenswrapper[4725]: I0225 11:59:04.224300 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:59:04 crc kubenswrapper[4725]: E0225 11:59:04.224890 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:59:18 crc kubenswrapper[4725]: I0225 11:59:18.225662 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:59:18 crc kubenswrapper[4725]: E0225 11:59:18.226642 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:59:22 crc kubenswrapper[4725]: I0225 11:59:22.451641 4725 scope.go:117] "RemoveContainer" containerID="55909d9ea9318598e2b35ba0cda0f1f15b1b1c4953ba08213c9c82fa0619d411" Feb 25 11:59:33 crc kubenswrapper[4725]: I0225 11:59:33.224408 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:59:33 crc kubenswrapper[4725]: E0225 11:59:33.225234 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 11:59:45 crc kubenswrapper[4725]: I0225 11:59:45.229679 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 11:59:45 crc kubenswrapper[4725]: E0225 11:59:45.230310 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.162615 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29533680-8pgr6"] Feb 25 12:00:00 crc kubenswrapper[4725]: E0225 12:00:00.164350 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerName="copy" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.164373 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerName="copy" Feb 25 12:00:00 crc kubenswrapper[4725]: E0225 12:00:00.164399 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aec2150-ee04-432f-b0d3-a8a91d1eca11" containerName="oc" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.164409 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aec2150-ee04-432f-b0d3-a8a91d1eca11" containerName="oc" Feb 25 12:00:00 crc kubenswrapper[4725]: E0225 12:00:00.164635 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerName="gather" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.164645 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerName="gather" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.164936 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aec2150-ee04-432f-b0d3-a8a91d1eca11" containerName="oc" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.164965 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerName="copy" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.164985 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="54161a52-ee5a-492c-ba0b-2d9292adb410" containerName="gather" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.165953 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533680-8pgr6" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.168224 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.169206 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.169710 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.178606 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh"] Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.181006 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.183104 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.183274 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.195610 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533680-8pgr6"] Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.204490 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh"] Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.228924 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 12:00:00 crc 
kubenswrapper[4725]: E0225 12:00:00.229259 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.258577 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f536256b-6e58-4525-a5e7-21fe5c8ac119-config-volume\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.258630 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tklxr\" (UniqueName: \"kubernetes.io/projected/f536256b-6e58-4525-a5e7-21fe5c8ac119-kube-api-access-tklxr\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.258935 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f536256b-6e58-4525-a5e7-21fe5c8ac119-secret-volume\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.259398 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fbz\" 
(UniqueName: \"kubernetes.io/projected/47ad2428-3759-4e3e-bbf5-07b4aab3365c-kube-api-access-d6fbz\") pod \"auto-csr-approver-29533680-8pgr6\" (UID: \"47ad2428-3759-4e3e-bbf5-07b4aab3365c\") " pod="openshift-infra/auto-csr-approver-29533680-8pgr6" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.360666 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6fbz\" (UniqueName: \"kubernetes.io/projected/47ad2428-3759-4e3e-bbf5-07b4aab3365c-kube-api-access-d6fbz\") pod \"auto-csr-approver-29533680-8pgr6\" (UID: \"47ad2428-3759-4e3e-bbf5-07b4aab3365c\") " pod="openshift-infra/auto-csr-approver-29533680-8pgr6" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.360730 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f536256b-6e58-4525-a5e7-21fe5c8ac119-config-volume\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.360748 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tklxr\" (UniqueName: \"kubernetes.io/projected/f536256b-6e58-4525-a5e7-21fe5c8ac119-kube-api-access-tklxr\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.360819 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f536256b-6e58-4525-a5e7-21fe5c8ac119-secret-volume\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.361709 
4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f536256b-6e58-4525-a5e7-21fe5c8ac119-config-volume\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.379478 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f536256b-6e58-4525-a5e7-21fe5c8ac119-secret-volume\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.382509 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fbz\" (UniqueName: \"kubernetes.io/projected/47ad2428-3759-4e3e-bbf5-07b4aab3365c-kube-api-access-d6fbz\") pod \"auto-csr-approver-29533680-8pgr6\" (UID: \"47ad2428-3759-4e3e-bbf5-07b4aab3365c\") " pod="openshift-infra/auto-csr-approver-29533680-8pgr6" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.388657 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tklxr\" (UniqueName: \"kubernetes.io/projected/f536256b-6e58-4525-a5e7-21fe5c8ac119-kube-api-access-tklxr\") pod \"collect-profiles-29533680-zwmzh\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.492768 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533680-8pgr6" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.503625 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.971265 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533680-8pgr6"] Feb 25 12:00:00 crc kubenswrapper[4725]: W0225 12:00:00.979778 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf536256b_6e58_4525_a5e7_21fe5c8ac119.slice/crio-6e4d688b63a1d2708f5ba67743bb2b249a05b68c78b11b4151255070cc09f79d WatchSource:0}: Error finding container 6e4d688b63a1d2708f5ba67743bb2b249a05b68c78b11b4151255070cc09f79d: Status 404 returned error can't find the container with id 6e4d688b63a1d2708f5ba67743bb2b249a05b68c78b11b4151255070cc09f79d Feb 25 12:00:00 crc kubenswrapper[4725]: I0225 12:00:00.993905 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh"] Feb 25 12:00:01 crc kubenswrapper[4725]: I0225 12:00:01.391373 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" event={"ID":"f536256b-6e58-4525-a5e7-21fe5c8ac119","Type":"ContainerStarted","Data":"01498e4b73b12327bd9c3d05a6666e614ffd6b4b084f3393a246d5e6717d1eb5"} Feb 25 12:00:01 crc kubenswrapper[4725]: I0225 12:00:01.391424 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" event={"ID":"f536256b-6e58-4525-a5e7-21fe5c8ac119","Type":"ContainerStarted","Data":"6e4d688b63a1d2708f5ba67743bb2b249a05b68c78b11b4151255070cc09f79d"} Feb 25 12:00:01 crc kubenswrapper[4725]: I0225 12:00:01.411526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533680-8pgr6" 
event={"ID":"47ad2428-3759-4e3e-bbf5-07b4aab3365c","Type":"ContainerStarted","Data":"28d782fb5b31c1eff7e3efb8e7c6a9983b9ea9de417af7eaddf3bc12e4e62cd8"} Feb 25 12:00:01 crc kubenswrapper[4725]: I0225 12:00:01.426738 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" podStartSLOduration=1.42671969 podStartE2EDuration="1.42671969s" podCreationTimestamp="2026-02-25 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 12:00:01.422413556 +0000 UTC m=+4026.920995581" watchObservedRunningTime="2026-02-25 12:00:01.42671969 +0000 UTC m=+4026.925301715" Feb 25 12:00:02 crc kubenswrapper[4725]: I0225 12:00:02.422816 4725 generic.go:334] "Generic (PLEG): container finished" podID="f536256b-6e58-4525-a5e7-21fe5c8ac119" containerID="01498e4b73b12327bd9c3d05a6666e614ffd6b4b084f3393a246d5e6717d1eb5" exitCode=0 Feb 25 12:00:02 crc kubenswrapper[4725]: I0225 12:00:02.423003 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" event={"ID":"f536256b-6e58-4525-a5e7-21fe5c8ac119","Type":"ContainerDied","Data":"01498e4b73b12327bd9c3d05a6666e614ffd6b4b084f3393a246d5e6717d1eb5"} Feb 25 12:00:03 crc kubenswrapper[4725]: I0225 12:00:03.799447 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:03 crc kubenswrapper[4725]: I0225 12:00:03.937503 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f536256b-6e58-4525-a5e7-21fe5c8ac119-secret-volume\") pod \"f536256b-6e58-4525-a5e7-21fe5c8ac119\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " Feb 25 12:00:03 crc kubenswrapper[4725]: I0225 12:00:03.937596 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f536256b-6e58-4525-a5e7-21fe5c8ac119-config-volume\") pod \"f536256b-6e58-4525-a5e7-21fe5c8ac119\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " Feb 25 12:00:03 crc kubenswrapper[4725]: I0225 12:00:03.937676 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tklxr\" (UniqueName: \"kubernetes.io/projected/f536256b-6e58-4525-a5e7-21fe5c8ac119-kube-api-access-tklxr\") pod \"f536256b-6e58-4525-a5e7-21fe5c8ac119\" (UID: \"f536256b-6e58-4525-a5e7-21fe5c8ac119\") " Feb 25 12:00:03 crc kubenswrapper[4725]: I0225 12:00:03.938493 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f536256b-6e58-4525-a5e7-21fe5c8ac119-config-volume" (OuterVolumeSpecName: "config-volume") pod "f536256b-6e58-4525-a5e7-21fe5c8ac119" (UID: "f536256b-6e58-4525-a5e7-21fe5c8ac119"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 12:00:03 crc kubenswrapper[4725]: I0225 12:00:03.942842 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f536256b-6e58-4525-a5e7-21fe5c8ac119-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f536256b-6e58-4525-a5e7-21fe5c8ac119" (UID: "f536256b-6e58-4525-a5e7-21fe5c8ac119"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 12:00:03 crc kubenswrapper[4725]: I0225 12:00:03.944702 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f536256b-6e58-4525-a5e7-21fe5c8ac119-kube-api-access-tklxr" (OuterVolumeSpecName: "kube-api-access-tklxr") pod "f536256b-6e58-4525-a5e7-21fe5c8ac119" (UID: "f536256b-6e58-4525-a5e7-21fe5c8ac119"). InnerVolumeSpecName "kube-api-access-tklxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.040063 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f536256b-6e58-4525-a5e7-21fe5c8ac119-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.040391 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f536256b-6e58-4525-a5e7-21fe5c8ac119-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.040557 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tklxr\" (UniqueName: \"kubernetes.io/projected/f536256b-6e58-4525-a5e7-21fe5c8ac119-kube-api-access-tklxr\") on node \"crc\" DevicePath \"\"" Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.445326 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" event={"ID":"f536256b-6e58-4525-a5e7-21fe5c8ac119","Type":"ContainerDied","Data":"6e4d688b63a1d2708f5ba67743bb2b249a05b68c78b11b4151255070cc09f79d"} Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.445372 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e4d688b63a1d2708f5ba67743bb2b249a05b68c78b11b4151255070cc09f79d" Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.445397 4725 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533680-zwmzh" Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.497377 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz"] Feb 25 12:00:04 crc kubenswrapper[4725]: I0225 12:00:04.509319 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533635-5jjnz"] Feb 25 12:00:05 crc kubenswrapper[4725]: I0225 12:00:05.239184 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de1f14b-c4f4-4751-8fc6-8d4336738638" path="/var/lib/kubelet/pods/2de1f14b-c4f4-4751-8fc6-8d4336738638/volumes" Feb 25 12:00:05 crc kubenswrapper[4725]: I0225 12:00:05.453965 4725 generic.go:334] "Generic (PLEG): container finished" podID="47ad2428-3759-4e3e-bbf5-07b4aab3365c" containerID="affa07606e90db95ab74b2d7772c4dbddc5867cdf03918fdc27db222bdd91e27" exitCode=0 Feb 25 12:00:05 crc kubenswrapper[4725]: I0225 12:00:05.454014 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533680-8pgr6" event={"ID":"47ad2428-3759-4e3e-bbf5-07b4aab3365c","Type":"ContainerDied","Data":"affa07606e90db95ab74b2d7772c4dbddc5867cdf03918fdc27db222bdd91e27"} Feb 25 12:00:06 crc kubenswrapper[4725]: I0225 12:00:06.784113 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533680-8pgr6" Feb 25 12:00:06 crc kubenswrapper[4725]: I0225 12:00:06.890259 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6fbz\" (UniqueName: \"kubernetes.io/projected/47ad2428-3759-4e3e-bbf5-07b4aab3365c-kube-api-access-d6fbz\") pod \"47ad2428-3759-4e3e-bbf5-07b4aab3365c\" (UID: \"47ad2428-3759-4e3e-bbf5-07b4aab3365c\") " Feb 25 12:00:06 crc kubenswrapper[4725]: I0225 12:00:06.895393 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ad2428-3759-4e3e-bbf5-07b4aab3365c-kube-api-access-d6fbz" (OuterVolumeSpecName: "kube-api-access-d6fbz") pod "47ad2428-3759-4e3e-bbf5-07b4aab3365c" (UID: "47ad2428-3759-4e3e-bbf5-07b4aab3365c"). InnerVolumeSpecName "kube-api-access-d6fbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:00:06 crc kubenswrapper[4725]: I0225 12:00:06.992965 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6fbz\" (UniqueName: \"kubernetes.io/projected/47ad2428-3759-4e3e-bbf5-07b4aab3365c-kube-api-access-d6fbz\") on node \"crc\" DevicePath \"\"" Feb 25 12:00:07 crc kubenswrapper[4725]: I0225 12:00:07.473037 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533680-8pgr6" event={"ID":"47ad2428-3759-4e3e-bbf5-07b4aab3365c","Type":"ContainerDied","Data":"28d782fb5b31c1eff7e3efb8e7c6a9983b9ea9de417af7eaddf3bc12e4e62cd8"} Feb 25 12:00:07 crc kubenswrapper[4725]: I0225 12:00:07.473092 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28d782fb5b31c1eff7e3efb8e7c6a9983b9ea9de417af7eaddf3bc12e4e62cd8" Feb 25 12:00:07 crc kubenswrapper[4725]: I0225 12:00:07.473407 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533680-8pgr6" Feb 25 12:00:07 crc kubenswrapper[4725]: I0225 12:00:07.840730 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533674-mgrk8"] Feb 25 12:00:07 crc kubenswrapper[4725]: I0225 12:00:07.849300 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533674-mgrk8"] Feb 25 12:00:09 crc kubenswrapper[4725]: I0225 12:00:09.236370 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bae8743-62db-4ba4-bbe3-6cdc5faf3fab" path="/var/lib/kubelet/pods/9bae8743-62db-4ba4-bbe3-6cdc5faf3fab/volumes" Feb 25 12:00:13 crc kubenswrapper[4725]: I0225 12:00:13.224998 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 12:00:13 crc kubenswrapper[4725]: I0225 12:00:13.529032 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"b036620e875f4a758dd804181c8957fd14a1029d422786a3424f55fe7e40b96c"} Feb 25 12:00:22 crc kubenswrapper[4725]: I0225 12:00:22.561253 4725 scope.go:117] "RemoveContainer" containerID="a714fb5b0fbbd6eb1c72092440acbefab30f49029438acceed22e82f3f0b6ac3" Feb 25 12:00:22 crc kubenswrapper[4725]: I0225 12:00:22.609245 4725 scope.go:117] "RemoveContainer" containerID="9fe5a01a20e4ff516797de838d1c0d2a3ecb9f82f8f3bf7c7dc8356f978dc27b" Feb 25 12:00:22 crc kubenswrapper[4725]: I0225 12:00:22.626301 4725 scope.go:117] "RemoveContainer" containerID="37760fd477badc1e89b39eb0e039ae82d336e26fdddfb6739b58c03dc4c3e8d6" Feb 25 12:00:22 crc kubenswrapper[4725]: I0225 12:00:22.673003 4725 scope.go:117] "RemoveContainer" containerID="e29609f6451245cb476005c03ddd27d75a8bedf9eabc04e230fd966e7a1f9e12" Feb 25 12:00:22 crc kubenswrapper[4725]: I0225 12:00:22.691006 4725 scope.go:117] 
"RemoveContainer" containerID="09138f4105649d7882e72851f2813be548d87d6757b9e8dce2f39380a59306b4" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.145078 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29533681-fkv8s"] Feb 25 12:01:00 crc kubenswrapper[4725]: E0225 12:01:00.146028 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f536256b-6e58-4525-a5e7-21fe5c8ac119" containerName="collect-profiles" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.146045 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f536256b-6e58-4525-a5e7-21fe5c8ac119" containerName="collect-profiles" Feb 25 12:01:00 crc kubenswrapper[4725]: E0225 12:01:00.146063 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ad2428-3759-4e3e-bbf5-07b4aab3365c" containerName="oc" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.146069 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ad2428-3759-4e3e-bbf5-07b4aab3365c" containerName="oc" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.146242 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ad2428-3759-4e3e-bbf5-07b4aab3365c" containerName="oc" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.146269 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f536256b-6e58-4525-a5e7-21fe5c8ac119" containerName="collect-profiles" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.146978 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.166153 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29533681-fkv8s"] Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.207884 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-fernet-keys\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.207929 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8tmk\" (UniqueName: \"kubernetes.io/projected/bc1ee72a-eece-401e-9998-5570f2d5db12-kube-api-access-t8tmk\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.208027 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-combined-ca-bundle\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.208066 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-config-data\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.309345 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-fernet-keys\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.309397 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8tmk\" (UniqueName: \"kubernetes.io/projected/bc1ee72a-eece-401e-9998-5570f2d5db12-kube-api-access-t8tmk\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.309459 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-combined-ca-bundle\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.309498 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-config-data\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.315982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-config-data\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.319020 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-combined-ca-bundle\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.336690 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-fernet-keys\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.339017 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8tmk\" (UniqueName: \"kubernetes.io/projected/bc1ee72a-eece-401e-9998-5570f2d5db12-kube-api-access-t8tmk\") pod \"keystone-cron-29533681-fkv8s\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") " pod="openstack/keystone-cron-29533681-fkv8s" Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.465081 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29533681-fkv8s"
Feb 25 12:01:00 crc kubenswrapper[4725]: I0225 12:01:00.956044 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29533681-fkv8s"]
Feb 25 12:01:01 crc kubenswrapper[4725]: I0225 12:01:01.041938 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533681-fkv8s" event={"ID":"bc1ee72a-eece-401e-9998-5570f2d5db12","Type":"ContainerStarted","Data":"33e5658967c2fd21b21d029ac563036d5578d10ac02f5d91334d46734fa81f5a"}
Feb 25 12:01:02 crc kubenswrapper[4725]: I0225 12:01:02.052630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533681-fkv8s" event={"ID":"bc1ee72a-eece-401e-9998-5570f2d5db12","Type":"ContainerStarted","Data":"f3bba482276ebf6070509acb56076e8fb228542bf01b4d01a4cb9d84057543e2"}
Feb 25 12:01:02 crc kubenswrapper[4725]: I0225 12:01:02.070966 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29533681-fkv8s" podStartSLOduration=2.070948332 podStartE2EDuration="2.070948332s" podCreationTimestamp="2026-02-25 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 12:01:02.066433492 +0000 UTC m=+4087.565015527" watchObservedRunningTime="2026-02-25 12:01:02.070948332 +0000 UTC m=+4087.569530357"
Feb 25 12:01:07 crc kubenswrapper[4725]: I0225 12:01:07.110727 4725 generic.go:334] "Generic (PLEG): container finished" podID="bc1ee72a-eece-401e-9998-5570f2d5db12" containerID="f3bba482276ebf6070509acb56076e8fb228542bf01b4d01a4cb9d84057543e2" exitCode=0
Feb 25 12:01:07 crc kubenswrapper[4725]: I0225 12:01:07.110932 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533681-fkv8s" event={"ID":"bc1ee72a-eece-401e-9998-5570f2d5db12","Type":"ContainerDied","Data":"f3bba482276ebf6070509acb56076e8fb228542bf01b4d01a4cb9d84057543e2"}
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.515651 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29533681-fkv8s"
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.662351 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-fernet-keys\") pod \"bc1ee72a-eece-401e-9998-5570f2d5db12\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") "
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.662490 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-config-data\") pod \"bc1ee72a-eece-401e-9998-5570f2d5db12\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") "
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.662550 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-combined-ca-bundle\") pod \"bc1ee72a-eece-401e-9998-5570f2d5db12\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") "
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.662597 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8tmk\" (UniqueName: \"kubernetes.io/projected/bc1ee72a-eece-401e-9998-5570f2d5db12-kube-api-access-t8tmk\") pod \"bc1ee72a-eece-401e-9998-5570f2d5db12\" (UID: \"bc1ee72a-eece-401e-9998-5570f2d5db12\") "
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.670310 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bc1ee72a-eece-401e-9998-5570f2d5db12" (UID: "bc1ee72a-eece-401e-9998-5570f2d5db12"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.670331 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1ee72a-eece-401e-9998-5570f2d5db12-kube-api-access-t8tmk" (OuterVolumeSpecName: "kube-api-access-t8tmk") pod "bc1ee72a-eece-401e-9998-5570f2d5db12" (UID: "bc1ee72a-eece-401e-9998-5570f2d5db12"). InnerVolumeSpecName "kube-api-access-t8tmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.699621 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc1ee72a-eece-401e-9998-5570f2d5db12" (UID: "bc1ee72a-eece-401e-9998-5570f2d5db12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.732379 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-config-data" (OuterVolumeSpecName: "config-data") pod "bc1ee72a-eece-401e-9998-5570f2d5db12" (UID: "bc1ee72a-eece-401e-9998-5570f2d5db12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.765260 4725 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.765300 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.765311 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1ee72a-eece-401e-9998-5570f2d5db12-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 12:01:08 crc kubenswrapper[4725]: I0225 12:01:08.765320 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8tmk\" (UniqueName: \"kubernetes.io/projected/bc1ee72a-eece-401e-9998-5570f2d5db12-kube-api-access-t8tmk\") on node \"crc\" DevicePath \"\""
Feb 25 12:01:09 crc kubenswrapper[4725]: I0225 12:01:09.129999 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533681-fkv8s" event={"ID":"bc1ee72a-eece-401e-9998-5570f2d5db12","Type":"ContainerDied","Data":"33e5658967c2fd21b21d029ac563036d5578d10ac02f5d91334d46734fa81f5a"}
Feb 25 12:01:09 crc kubenswrapper[4725]: I0225 12:01:09.130346 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33e5658967c2fd21b21d029ac563036d5578d10ac02f5d91334d46734fa81f5a"
Feb 25 12:01:09 crc kubenswrapper[4725]: I0225 12:01:09.130072 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29533681-fkv8s"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.173471 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f6f6w"]
Feb 25 12:01:38 crc kubenswrapper[4725]: E0225 12:01:38.174518 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1ee72a-eece-401e-9998-5570f2d5db12" containerName="keystone-cron"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.174536 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1ee72a-eece-401e-9998-5570f2d5db12" containerName="keystone-cron"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.174782 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1ee72a-eece-401e-9998-5570f2d5db12" containerName="keystone-cron"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.176467 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.190749 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6f6w"]
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.300881 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v2gr\" (UniqueName: \"kubernetes.io/projected/fc64a848-44bd-4ba0-84f2-182dbd0411ad-kube-api-access-5v2gr\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.301699 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-utilities\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.301847 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-catalog-content\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.403058 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v2gr\" (UniqueName: \"kubernetes.io/projected/fc64a848-44bd-4ba0-84f2-182dbd0411ad-kube-api-access-5v2gr\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.403174 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-utilities\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.403203 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-catalog-content\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.404184 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-catalog-content\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.404284 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-utilities\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.483132 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v2gr\" (UniqueName: \"kubernetes.io/projected/fc64a848-44bd-4ba0-84f2-182dbd0411ad-kube-api-access-5v2gr\") pod \"community-operators-f6f6w\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") " pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.498886 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:38 crc kubenswrapper[4725]: I0225 12:01:38.981753 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6f6w"]
Feb 25 12:01:39 crc kubenswrapper[4725]: I0225 12:01:39.452669 4725 generic.go:334] "Generic (PLEG): container finished" podID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerID="9c86e260d8620a12faac240fb65889d862e71728642a45ab5c876038ae5e279b" exitCode=0
Feb 25 12:01:39 crc kubenswrapper[4725]: I0225 12:01:39.452731 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f6w" event={"ID":"fc64a848-44bd-4ba0-84f2-182dbd0411ad","Type":"ContainerDied","Data":"9c86e260d8620a12faac240fb65889d862e71728642a45ab5c876038ae5e279b"}
Feb 25 12:01:39 crc kubenswrapper[4725]: I0225 12:01:39.452794 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f6w" event={"ID":"fc64a848-44bd-4ba0-84f2-182dbd0411ad","Type":"ContainerStarted","Data":"27a6cc3e04c98091e06752f99840b99c7313f081d9161c05f10d53535ee27393"}
Feb 25 12:01:39 crc kubenswrapper[4725]: I0225 12:01:39.455534 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 25 12:01:41 crc kubenswrapper[4725]: I0225 12:01:41.478779 4725 generic.go:334] "Generic (PLEG): container finished" podID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerID="ffec85d015b9db13f1855e940a79d65632be38c51e6adb5476af43c053aa1fcf" exitCode=0
Feb 25 12:01:41 crc kubenswrapper[4725]: I0225 12:01:41.478955 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f6w" event={"ID":"fc64a848-44bd-4ba0-84f2-182dbd0411ad","Type":"ContainerDied","Data":"ffec85d015b9db13f1855e940a79d65632be38c51e6adb5476af43c053aa1fcf"}
Feb 25 12:01:42 crc kubenswrapper[4725]: I0225 12:01:42.490449 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f6w" event={"ID":"fc64a848-44bd-4ba0-84f2-182dbd0411ad","Type":"ContainerStarted","Data":"28a1c5a50baa6df28d366961a9e3b1720c250b4fb3cf493a8e24defaf397ba20"}
Feb 25 12:01:42 crc kubenswrapper[4725]: I0225 12:01:42.516916 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f6f6w" podStartSLOduration=2.101692941 podStartE2EDuration="4.516893181s" podCreationTimestamp="2026-02-25 12:01:38 +0000 UTC" firstStartedPulling="2026-02-25 12:01:39.455300315 +0000 UTC m=+4124.953882340" lastFinishedPulling="2026-02-25 12:01:41.870500555 +0000 UTC m=+4127.369082580" observedRunningTime="2026-02-25 12:01:42.508335894 +0000 UTC m=+4128.006917929" watchObservedRunningTime="2026-02-25 12:01:42.516893181 +0000 UTC m=+4128.015475216"
Feb 25 12:01:48 crc kubenswrapper[4725]: I0225 12:01:48.499992 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:48 crc kubenswrapper[4725]: I0225 12:01:48.503418 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:48 crc kubenswrapper[4725]: I0225 12:01:48.550343 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:48 crc kubenswrapper[4725]: I0225 12:01:48.600988 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:48 crc kubenswrapper[4725]: I0225 12:01:48.796280 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f6f6w"]
Feb 25 12:01:50 crc kubenswrapper[4725]: I0225 12:01:50.565102 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f6f6w" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="registry-server" containerID="cri-o://28a1c5a50baa6df28d366961a9e3b1720c250b4fb3cf493a8e24defaf397ba20" gracePeriod=2
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.159750 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ghxvx/must-gather-4jxd6"]
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.168974 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.178059 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ghxvx"/"openshift-service-ca.crt"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.178068 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ghxvx"/"kube-root-ca.crt"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.187906 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ghxvx/must-gather-4jxd6"]
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.265768 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ll2x\" (UniqueName: \"kubernetes.io/projected/b395cf46-cc41-434e-be61-6104918005b0-kube-api-access-8ll2x\") pod \"must-gather-4jxd6\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") " pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.266124 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b395cf46-cc41-434e-be61-6104918005b0-must-gather-output\") pod \"must-gather-4jxd6\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") " pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.368370 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ll2x\" (UniqueName: \"kubernetes.io/projected/b395cf46-cc41-434e-be61-6104918005b0-kube-api-access-8ll2x\") pod \"must-gather-4jxd6\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") " pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.368521 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b395cf46-cc41-434e-be61-6104918005b0-must-gather-output\") pod \"must-gather-4jxd6\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") " pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.368961 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b395cf46-cc41-434e-be61-6104918005b0-must-gather-output\") pod \"must-gather-4jxd6\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") " pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.394526 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ll2x\" (UniqueName: \"kubernetes.io/projected/b395cf46-cc41-434e-be61-6104918005b0-kube-api-access-8ll2x\") pod \"must-gather-4jxd6\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") " pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.501088 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.586455 4725 generic.go:334] "Generic (PLEG): container finished" podID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerID="28a1c5a50baa6df28d366961a9e3b1720c250b4fb3cf493a8e24defaf397ba20" exitCode=0
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.586719 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f6w" event={"ID":"fc64a848-44bd-4ba0-84f2-182dbd0411ad","Type":"ContainerDied","Data":"28a1c5a50baa6df28d366961a9e3b1720c250b4fb3cf493a8e24defaf397ba20"}
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.586749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f6w" event={"ID":"fc64a848-44bd-4ba0-84f2-182dbd0411ad","Type":"ContainerDied","Data":"27a6cc3e04c98091e06752f99840b99c7313f081d9161c05f10d53535ee27393"}
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.586763 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27a6cc3e04c98091e06752f99840b99c7313f081d9161c05f10d53535ee27393"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.655496 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.779526 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v2gr\" (UniqueName: \"kubernetes.io/projected/fc64a848-44bd-4ba0-84f2-182dbd0411ad-kube-api-access-5v2gr\") pod \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") "
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.779736 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-catalog-content\") pod \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") "
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.779772 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-utilities\") pod \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\" (UID: \"fc64a848-44bd-4ba0-84f2-182dbd0411ad\") "
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.783582 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-utilities" (OuterVolumeSpecName: "utilities") pod "fc64a848-44bd-4ba0-84f2-182dbd0411ad" (UID: "fc64a848-44bd-4ba0-84f2-182dbd0411ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.791044 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc64a848-44bd-4ba0-84f2-182dbd0411ad-kube-api-access-5v2gr" (OuterVolumeSpecName: "kube-api-access-5v2gr") pod "fc64a848-44bd-4ba0-84f2-182dbd0411ad" (UID: "fc64a848-44bd-4ba0-84f2-182dbd0411ad"). InnerVolumeSpecName "kube-api-access-5v2gr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.847327 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc64a848-44bd-4ba0-84f2-182dbd0411ad" (UID: "fc64a848-44bd-4ba0-84f2-182dbd0411ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.882444 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.882477 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc64a848-44bd-4ba0-84f2-182dbd0411ad-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.882486 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v2gr\" (UniqueName: \"kubernetes.io/projected/fc64a848-44bd-4ba0-84f2-182dbd0411ad-kube-api-access-5v2gr\") on node \"crc\" DevicePath \"\""
Feb 25 12:01:51 crc kubenswrapper[4725]: I0225 12:01:51.977428 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ghxvx/must-gather-4jxd6"]
Feb 25 12:01:52 crc kubenswrapper[4725]: I0225 12:01:52.595480 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/must-gather-4jxd6" event={"ID":"b395cf46-cc41-434e-be61-6104918005b0","Type":"ContainerStarted","Data":"fe1438b85acf248d20982bfce88662408bf06cc768d9c4ae7531379e519f5a42"}
Feb 25 12:01:52 crc kubenswrapper[4725]: I0225 12:01:52.595522 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6f6w"
Feb 25 12:01:52 crc kubenswrapper[4725]: I0225 12:01:52.626785 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f6f6w"]
Feb 25 12:01:52 crc kubenswrapper[4725]: I0225 12:01:52.638529 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f6f6w"]
Feb 25 12:01:53 crc kubenswrapper[4725]: I0225 12:01:53.234453 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" path="/var/lib/kubelet/pods/fc64a848-44bd-4ba0-84f2-182dbd0411ad/volumes"
Feb 25 12:01:53 crc kubenswrapper[4725]: I0225 12:01:53.610621 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/must-gather-4jxd6" event={"ID":"b395cf46-cc41-434e-be61-6104918005b0","Type":"ContainerStarted","Data":"97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d"}
Feb 25 12:01:53 crc kubenswrapper[4725]: I0225 12:01:53.610676 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/must-gather-4jxd6" event={"ID":"b395cf46-cc41-434e-be61-6104918005b0","Type":"ContainerStarted","Data":"3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c"}
Feb 25 12:01:53 crc kubenswrapper[4725]: I0225 12:01:53.639960 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ghxvx/must-gather-4jxd6" podStartSLOduration=2.6399365120000002 podStartE2EDuration="2.639936512s" podCreationTimestamp="2026-02-25 12:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 12:01:53.633798659 +0000 UTC m=+4139.132380704" watchObservedRunningTime="2026-02-25 12:01:53.639936512 +0000 UTC m=+4139.138518537"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.463599 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-gzsdm"]
Feb 25 12:01:56 crc kubenswrapper[4725]: E0225 12:01:56.465796 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="extract-utilities"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.465921 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="extract-utilities"
Feb 25 12:01:56 crc kubenswrapper[4725]: E0225 12:01:56.466007 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="registry-server"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.466095 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="registry-server"
Feb 25 12:01:56 crc kubenswrapper[4725]: E0225 12:01:56.466207 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="extract-content"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.466290 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="extract-content"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.466608 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc64a848-44bd-4ba0-84f2-182dbd0411ad" containerName="registry-server"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.467449 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.470012 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ghxvx"/"default-dockercfg-579rj"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.593381 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0292f8f9-0675-41b7-8618-690f67796019-host\") pod \"crc-debug-gzsdm\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.593770 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6hjq\" (UniqueName: \"kubernetes.io/projected/0292f8f9-0675-41b7-8618-690f67796019-kube-api-access-n6hjq\") pod \"crc-debug-gzsdm\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.696290 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0292f8f9-0675-41b7-8618-690f67796019-host\") pod \"crc-debug-gzsdm\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.696408 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6hjq\" (UniqueName: \"kubernetes.io/projected/0292f8f9-0675-41b7-8618-690f67796019-kube-api-access-n6hjq\") pod \"crc-debug-gzsdm\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.696442 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0292f8f9-0675-41b7-8618-690f67796019-host\") pod \"crc-debug-gzsdm\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.716617 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6hjq\" (UniqueName: \"kubernetes.io/projected/0292f8f9-0675-41b7-8618-690f67796019-kube-api-access-n6hjq\") pod \"crc-debug-gzsdm\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: I0225 12:01:56.784348 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm"
Feb 25 12:01:56 crc kubenswrapper[4725]: W0225 12:01:56.811595 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0292f8f9_0675_41b7_8618_690f67796019.slice/crio-f92f0d59190d4105bef7d7899ce32685e448971f9060f6123309a100de8def29 WatchSource:0}: Error finding container f92f0d59190d4105bef7d7899ce32685e448971f9060f6123309a100de8def29: Status 404 returned error can't find the container with id f92f0d59190d4105bef7d7899ce32685e448971f9060f6123309a100de8def29
Feb 25 12:01:57 crc kubenswrapper[4725]: I0225 12:01:57.664577 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm" event={"ID":"0292f8f9-0675-41b7-8618-690f67796019","Type":"ContainerStarted","Data":"ca7b9918529af83160ae620b32d7cb9ddd040a5b06e47c059dce99c2bada939d"}
Feb 25 12:01:57 crc kubenswrapper[4725]: I0225 12:01:57.665015 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm" event={"ID":"0292f8f9-0675-41b7-8618-690f67796019","Type":"ContainerStarted","Data":"f92f0d59190d4105bef7d7899ce32685e448971f9060f6123309a100de8def29"}
Feb 25 12:01:57 crc kubenswrapper[4725]: I0225 12:01:57.721275 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm" podStartSLOduration=1.7212127050000001 podStartE2EDuration="1.721212705s" podCreationTimestamp="2026-02-25 12:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 12:01:57.683856034 +0000 UTC m=+4143.182438079" watchObservedRunningTime="2026-02-25 12:01:57.721212705 +0000 UTC m=+4143.219794760"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.142635 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533682-nsxgv"]
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.145399 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533682-nsxgv"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.148687 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.148960 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.149133 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.153046 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533682-nsxgv"]
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.159760 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knb94\" (UniqueName: \"kubernetes.io/projected/a075f866-c9c5-4c21-b642-570305f1cbd7-kube-api-access-knb94\") pod \"auto-csr-approver-29533682-nsxgv\" (UID: \"a075f866-c9c5-4c21-b642-570305f1cbd7\") " pod="openshift-infra/auto-csr-approver-29533682-nsxgv"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.261476 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knb94\" (UniqueName: \"kubernetes.io/projected/a075f866-c9c5-4c21-b642-570305f1cbd7-kube-api-access-knb94\") pod \"auto-csr-approver-29533682-nsxgv\" (UID: \"a075f866-c9c5-4c21-b642-570305f1cbd7\") " pod="openshift-infra/auto-csr-approver-29533682-nsxgv"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.684747 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knb94\" (UniqueName: \"kubernetes.io/projected/a075f866-c9c5-4c21-b642-570305f1cbd7-kube-api-access-knb94\") pod \"auto-csr-approver-29533682-nsxgv\" (UID: \"a075f866-c9c5-4c21-b642-570305f1cbd7\") " pod="openshift-infra/auto-csr-approver-29533682-nsxgv"
Feb 25 12:02:00 crc kubenswrapper[4725]: I0225 12:02:00.771132 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533682-nsxgv"
Feb 25 12:02:01 crc kubenswrapper[4725]: I0225 12:02:01.298548 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533682-nsxgv"]
Feb 25 12:02:01 crc kubenswrapper[4725]: I0225 12:02:01.712370 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533682-nsxgv" event={"ID":"a075f866-c9c5-4c21-b642-570305f1cbd7","Type":"ContainerStarted","Data":"98d7b5890b9c21cfe57a0fbdf15cfd483c09e2aebc0b07a387b2fc722f7aaee2"}
Feb 25 12:02:03 crc kubenswrapper[4725]: I0225 12:02:03.731778 4725 generic.go:334] "Generic (PLEG): container finished" podID="a075f866-c9c5-4c21-b642-570305f1cbd7" containerID="9ac2f59219692a3d9243cfcce0deb70d8b971e54af453ceeac445fa6fb8c6e52" exitCode=0
Feb 25 12:02:03 crc kubenswrapper[4725]: I0225 12:02:03.731839 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533682-nsxgv" event={"ID":"a075f866-c9c5-4c21-b642-570305f1cbd7","Type":"ContainerDied","Data":"9ac2f59219692a3d9243cfcce0deb70d8b971e54af453ceeac445fa6fb8c6e52"}
Feb 25 12:02:05 crc kubenswrapper[4725]: I0225 12:02:05.170915 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533682-nsxgv"
Feb 25 12:02:05 crc kubenswrapper[4725]: I0225 12:02:05.355731 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knb94\" (UniqueName: \"kubernetes.io/projected/a075f866-c9c5-4c21-b642-570305f1cbd7-kube-api-access-knb94\") pod \"a075f866-c9c5-4c21-b642-570305f1cbd7\" (UID: \"a075f866-c9c5-4c21-b642-570305f1cbd7\") "
Feb 25 12:02:05 crc kubenswrapper[4725]: I0225 12:02:05.363178 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a075f866-c9c5-4c21-b642-570305f1cbd7-kube-api-access-knb94" (OuterVolumeSpecName: "kube-api-access-knb94") pod "a075f866-c9c5-4c21-b642-570305f1cbd7" (UID: "a075f866-c9c5-4c21-b642-570305f1cbd7"). InnerVolumeSpecName "kube-api-access-knb94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 12:02:05 crc kubenswrapper[4725]: I0225 12:02:05.458230 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knb94\" (UniqueName: \"kubernetes.io/projected/a075f866-c9c5-4c21-b642-570305f1cbd7-kube-api-access-knb94\") on node \"crc\" DevicePath \"\""
Feb 25 12:02:05 crc kubenswrapper[4725]: I0225 12:02:05.749376 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533682-nsxgv" event={"ID":"a075f866-c9c5-4c21-b642-570305f1cbd7","Type":"ContainerDied","Data":"98d7b5890b9c21cfe57a0fbdf15cfd483c09e2aebc0b07a387b2fc722f7aaee2"}
Feb 25 12:02:05 crc kubenswrapper[4725]: I0225 12:02:05.749413 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d7b5890b9c21cfe57a0fbdf15cfd483c09e2aebc0b07a387b2fc722f7aaee2"
Feb 25 12:02:05 crc kubenswrapper[4725]: I0225 12:02:05.749460 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533682-nsxgv"
Feb 25 12:02:06 crc kubenswrapper[4725]: I0225 12:02:06.242159 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533676-9qm58"]
Feb 25 12:02:06 crc kubenswrapper[4725]: I0225 12:02:06.249583 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533676-9qm58"]
Feb 25 12:02:07 crc kubenswrapper[4725]: I0225 12:02:07.237355 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720b297c-f3c3-4eac-8997-56451b4a2427" path="/var/lib/kubelet/pods/720b297c-f3c3-4eac-8997-56451b4a2427/volumes"
Feb 25 12:02:22 crc kubenswrapper[4725]: I0225 12:02:22.838398 4725 scope.go:117] "RemoveContainer" containerID="f21e36498f6fad3fc49c5e0bdbd75ed28aab7fb6b98bfae792e34203bab2f537"
Feb 25 12:02:31 crc kubenswrapper[4725]: I0225 12:02:31.993391 4725 generic.go:334] "Generic (PLEG): container finished"
podID="0292f8f9-0675-41b7-8618-690f67796019" containerID="ca7b9918529af83160ae620b32d7cb9ddd040a5b06e47c059dce99c2bada939d" exitCode=0 Feb 25 12:02:31 crc kubenswrapper[4725]: I0225 12:02:31.993465 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm" event={"ID":"0292f8f9-0675-41b7-8618-690f67796019","Type":"ContainerDied","Data":"ca7b9918529af83160ae620b32d7cb9ddd040a5b06e47c059dce99c2bada939d"} Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.107764 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm" Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.136154 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-gzsdm"] Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.144196 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-gzsdm"] Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.285387 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0292f8f9-0675-41b7-8618-690f67796019-host\") pod \"0292f8f9-0675-41b7-8618-690f67796019\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.285507 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0292f8f9-0675-41b7-8618-690f67796019-host" (OuterVolumeSpecName: "host") pod "0292f8f9-0675-41b7-8618-690f67796019" (UID: "0292f8f9-0675-41b7-8618-690f67796019"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.285568 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6hjq\" (UniqueName: \"kubernetes.io/projected/0292f8f9-0675-41b7-8618-690f67796019-kube-api-access-n6hjq\") pod \"0292f8f9-0675-41b7-8618-690f67796019\" (UID: \"0292f8f9-0675-41b7-8618-690f67796019\") " Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.286052 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0292f8f9-0675-41b7-8618-690f67796019-host\") on node \"crc\" DevicePath \"\"" Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.292923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0292f8f9-0675-41b7-8618-690f67796019-kube-api-access-n6hjq" (OuterVolumeSpecName: "kube-api-access-n6hjq") pod "0292f8f9-0675-41b7-8618-690f67796019" (UID: "0292f8f9-0675-41b7-8618-690f67796019"). InnerVolumeSpecName "kube-api-access-n6hjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:02:33 crc kubenswrapper[4725]: I0225 12:02:33.388461 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6hjq\" (UniqueName: \"kubernetes.io/projected/0292f8f9-0675-41b7-8618-690f67796019-kube-api-access-n6hjq\") on node \"crc\" DevicePath \"\"" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.019242 4725 scope.go:117] "RemoveContainer" containerID="ca7b9918529af83160ae620b32d7cb9ddd040a5b06e47c059dce99c2bada939d" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.019278 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-gzsdm" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.406975 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-d49kv"] Feb 25 12:02:34 crc kubenswrapper[4725]: E0225 12:02:34.407355 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0292f8f9-0675-41b7-8618-690f67796019" containerName="container-00" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.407367 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0292f8f9-0675-41b7-8618-690f67796019" containerName="container-00" Feb 25 12:02:34 crc kubenswrapper[4725]: E0225 12:02:34.407380 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a075f866-c9c5-4c21-b642-570305f1cbd7" containerName="oc" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.407386 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a075f866-c9c5-4c21-b642-570305f1cbd7" containerName="oc" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.408541 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a075f866-c9c5-4c21-b642-570305f1cbd7" containerName="oc" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.408581 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0292f8f9-0675-41b7-8618-690f67796019" containerName="container-00" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.409718 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.425519 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ghxvx"/"default-dockercfg-579rj" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.508368 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0327a6cd-a070-498f-b04a-c03af8380401-host\") pod \"crc-debug-d49kv\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.508484 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxs2\" (UniqueName: \"kubernetes.io/projected/0327a6cd-a070-498f-b04a-c03af8380401-kube-api-access-vhxs2\") pod \"crc-debug-d49kv\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.610684 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0327a6cd-a070-498f-b04a-c03af8380401-host\") pod \"crc-debug-d49kv\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.610746 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxs2\" (UniqueName: \"kubernetes.io/projected/0327a6cd-a070-498f-b04a-c03af8380401-kube-api-access-vhxs2\") pod \"crc-debug-d49kv\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.611016 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/0327a6cd-a070-498f-b04a-c03af8380401-host\") pod \"crc-debug-d49kv\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.634914 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxs2\" (UniqueName: \"kubernetes.io/projected/0327a6cd-a070-498f-b04a-c03af8380401-kube-api-access-vhxs2\") pod \"crc-debug-d49kv\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:34 crc kubenswrapper[4725]: I0225 12:02:34.749073 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:35 crc kubenswrapper[4725]: I0225 12:02:35.031940 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/crc-debug-d49kv" event={"ID":"0327a6cd-a070-498f-b04a-c03af8380401","Type":"ContainerStarted","Data":"f10439adfe66ae7b8e74fb558abbfc1e08037025749c627842172445c40b7cdf"} Feb 25 12:02:35 crc kubenswrapper[4725]: I0225 12:02:35.236877 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0292f8f9-0675-41b7-8618-690f67796019" path="/var/lib/kubelet/pods/0292f8f9-0675-41b7-8618-690f67796019/volumes" Feb 25 12:02:36 crc kubenswrapper[4725]: I0225 12:02:36.046753 4725 generic.go:334] "Generic (PLEG): container finished" podID="0327a6cd-a070-498f-b04a-c03af8380401" containerID="11c46d18c6958055da08f138fa49f36c2c8b93a3ec656b9016924e7fc24dc411" exitCode=0 Feb 25 12:02:36 crc kubenswrapper[4725]: I0225 12:02:36.046801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/crc-debug-d49kv" event={"ID":"0327a6cd-a070-498f-b04a-c03af8380401","Type":"ContainerDied","Data":"11c46d18c6958055da08f138fa49f36c2c8b93a3ec656b9016924e7fc24dc411"} Feb 25 12:02:36 crc kubenswrapper[4725]: I0225 12:02:36.531443 4725 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-d49kv"] Feb 25 12:02:36 crc kubenswrapper[4725]: I0225 12:02:36.544911 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-d49kv"] Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.292711 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.458270 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhxs2\" (UniqueName: \"kubernetes.io/projected/0327a6cd-a070-498f-b04a-c03af8380401-kube-api-access-vhxs2\") pod \"0327a6cd-a070-498f-b04a-c03af8380401\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.458723 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0327a6cd-a070-498f-b04a-c03af8380401-host\") pod \"0327a6cd-a070-498f-b04a-c03af8380401\" (UID: \"0327a6cd-a070-498f-b04a-c03af8380401\") " Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.459243 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0327a6cd-a070-498f-b04a-c03af8380401-host" (OuterVolumeSpecName: "host") pod "0327a6cd-a070-498f-b04a-c03af8380401" (UID: "0327a6cd-a070-498f-b04a-c03af8380401"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.467223 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0327a6cd-a070-498f-b04a-c03af8380401-kube-api-access-vhxs2" (OuterVolumeSpecName: "kube-api-access-vhxs2") pod "0327a6cd-a070-498f-b04a-c03af8380401" (UID: "0327a6cd-a070-498f-b04a-c03af8380401"). InnerVolumeSpecName "kube-api-access-vhxs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.561031 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhxs2\" (UniqueName: \"kubernetes.io/projected/0327a6cd-a070-498f-b04a-c03af8380401-kube-api-access-vhxs2\") on node \"crc\" DevicePath \"\"" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.561074 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0327a6cd-a070-498f-b04a-c03af8380401-host\") on node \"crc\" DevicePath \"\"" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.735790 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-fsqpc"] Feb 25 12:02:37 crc kubenswrapper[4725]: E0225 12:02:37.736222 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0327a6cd-a070-498f-b04a-c03af8380401" containerName="container-00" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.736242 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0327a6cd-a070-498f-b04a-c03af8380401" containerName="container-00" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.736442 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0327a6cd-a070-498f-b04a-c03af8380401" containerName="container-00" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.737041 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.865368 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrx5\" (UniqueName: \"kubernetes.io/projected/56d79abb-8de6-4166-812b-00b659f308cc-kube-api-access-vvrx5\") pod \"crc-debug-fsqpc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.865477 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56d79abb-8de6-4166-812b-00b659f308cc-host\") pod \"crc-debug-fsqpc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.967571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrx5\" (UniqueName: \"kubernetes.io/projected/56d79abb-8de6-4166-812b-00b659f308cc-kube-api-access-vvrx5\") pod \"crc-debug-fsqpc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.967717 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56d79abb-8de6-4166-812b-00b659f308cc-host\") pod \"crc-debug-fsqpc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:37 crc kubenswrapper[4725]: I0225 12:02:37.967908 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56d79abb-8de6-4166-812b-00b659f308cc-host\") pod \"crc-debug-fsqpc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:37 crc 
kubenswrapper[4725]: I0225 12:02:37.986966 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrx5\" (UniqueName: \"kubernetes.io/projected/56d79abb-8de6-4166-812b-00b659f308cc-kube-api-access-vvrx5\") pod \"crc-debug-fsqpc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:38 crc kubenswrapper[4725]: I0225 12:02:38.052450 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:38 crc kubenswrapper[4725]: I0225 12:02:38.074540 4725 scope.go:117] "RemoveContainer" containerID="11c46d18c6958055da08f138fa49f36c2c8b93a3ec656b9016924e7fc24dc411" Feb 25 12:02:38 crc kubenswrapper[4725]: I0225 12:02:38.074938 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-d49kv" Feb 25 12:02:38 crc kubenswrapper[4725]: W0225 12:02:38.095751 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d79abb_8de6_4166_812b_00b659f308cc.slice/crio-cabe0e7f1c929d40b4e71c898a5f5dc367269e0a5dd3aaff4604e16584b5dc91 WatchSource:0}: Error finding container cabe0e7f1c929d40b4e71c898a5f5dc367269e0a5dd3aaff4604e16584b5dc91: Status 404 returned error can't find the container with id cabe0e7f1c929d40b4e71c898a5f5dc367269e0a5dd3aaff4604e16584b5dc91 Feb 25 12:02:39 crc kubenswrapper[4725]: I0225 12:02:39.088184 4725 generic.go:334] "Generic (PLEG): container finished" podID="56d79abb-8de6-4166-812b-00b659f308cc" containerID="49f22eb0f1c57befe26bc6767d7d9934325250a5d9dc6e1322500dd59a6cd9e2" exitCode=0 Feb 25 12:02:39 crc kubenswrapper[4725]: I0225 12:02:39.088287 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" 
event={"ID":"56d79abb-8de6-4166-812b-00b659f308cc","Type":"ContainerDied","Data":"49f22eb0f1c57befe26bc6767d7d9934325250a5d9dc6e1322500dd59a6cd9e2"} Feb 25 12:02:39 crc kubenswrapper[4725]: I0225 12:02:39.088460 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" event={"ID":"56d79abb-8de6-4166-812b-00b659f308cc","Type":"ContainerStarted","Data":"cabe0e7f1c929d40b4e71c898a5f5dc367269e0a5dd3aaff4604e16584b5dc91"} Feb 25 12:02:39 crc kubenswrapper[4725]: I0225 12:02:39.131267 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-fsqpc"] Feb 25 12:02:39 crc kubenswrapper[4725]: I0225 12:02:39.139486 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ghxvx/crc-debug-fsqpc"] Feb 25 12:02:39 crc kubenswrapper[4725]: I0225 12:02:39.234581 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0327a6cd-a070-498f-b04a-c03af8380401" path="/var/lib/kubelet/pods/0327a6cd-a070-498f-b04a-c03af8380401/volumes" Feb 25 12:02:40 crc kubenswrapper[4725]: I0225 12:02:40.194884 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:40 crc kubenswrapper[4725]: I0225 12:02:40.312247 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56d79abb-8de6-4166-812b-00b659f308cc-host\") pod \"56d79abb-8de6-4166-812b-00b659f308cc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " Feb 25 12:02:40 crc kubenswrapper[4725]: I0225 12:02:40.312322 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvrx5\" (UniqueName: \"kubernetes.io/projected/56d79abb-8de6-4166-812b-00b659f308cc-kube-api-access-vvrx5\") pod \"56d79abb-8de6-4166-812b-00b659f308cc\" (UID: \"56d79abb-8de6-4166-812b-00b659f308cc\") " Feb 25 12:02:40 crc kubenswrapper[4725]: I0225 12:02:40.312455 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56d79abb-8de6-4166-812b-00b659f308cc-host" (OuterVolumeSpecName: "host") pod "56d79abb-8de6-4166-812b-00b659f308cc" (UID: "56d79abb-8de6-4166-812b-00b659f308cc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 12:02:40 crc kubenswrapper[4725]: I0225 12:02:40.312784 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56d79abb-8de6-4166-812b-00b659f308cc-host\") on node \"crc\" DevicePath \"\"" Feb 25 12:02:40 crc kubenswrapper[4725]: I0225 12:02:40.317947 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d79abb-8de6-4166-812b-00b659f308cc-kube-api-access-vvrx5" (OuterVolumeSpecName: "kube-api-access-vvrx5") pod "56d79abb-8de6-4166-812b-00b659f308cc" (UID: "56d79abb-8de6-4166-812b-00b659f308cc"). InnerVolumeSpecName "kube-api-access-vvrx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:02:40 crc kubenswrapper[4725]: I0225 12:02:40.414683 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvrx5\" (UniqueName: \"kubernetes.io/projected/56d79abb-8de6-4166-812b-00b659f308cc-kube-api-access-vvrx5\") on node \"crc\" DevicePath \"\"" Feb 25 12:02:41 crc kubenswrapper[4725]: I0225 12:02:41.105999 4725 scope.go:117] "RemoveContainer" containerID="49f22eb0f1c57befe26bc6767d7d9934325250a5d9dc6e1322500dd59a6cd9e2" Feb 25 12:02:41 crc kubenswrapper[4725]: I0225 12:02:41.106033 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/crc-debug-fsqpc" Feb 25 12:02:41 crc kubenswrapper[4725]: I0225 12:02:41.234636 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d79abb-8de6-4166-812b-00b659f308cc" path="/var/lib/kubelet/pods/56d79abb-8de6-4166-812b-00b659f308cc/volumes" Feb 25 12:02:41 crc kubenswrapper[4725]: I0225 12:02:41.555944 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 12:02:41 crc kubenswrapper[4725]: I0225 12:02:41.556002 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 12:03:09 crc kubenswrapper[4725]: I0225 12:03:09.575960 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84dc96ccc8-zhwrq_b4ab7d45-3a36-4ffc-9004-62ff70fbfe53/barbican-api-log/0.log" Feb 25 12:03:09 crc kubenswrapper[4725]: I0225 12:03:09.584176 
4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84dc96ccc8-zhwrq_b4ab7d45-3a36-4ffc-9004-62ff70fbfe53/barbican-api/0.log" Feb 25 12:03:09 crc kubenswrapper[4725]: I0225 12:03:09.735670 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b8b9cdb6b-d9zj4_b77182d3-74cf-4a61-a3a1-81efff62da8d/barbican-keystone-listener/0.log" Feb 25 12:03:09 crc kubenswrapper[4725]: I0225 12:03:09.737940 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b8b9cdb6b-d9zj4_b77182d3-74cf-4a61-a3a1-81efff62da8d/barbican-keystone-listener-log/0.log" Feb 25 12:03:09 crc kubenswrapper[4725]: I0225 12:03:09.800162 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6df8d5688f-fkmbb_09976716-81ab-4d43-8250-fe3812bc8029/barbican-worker/0.log" Feb 25 12:03:09 crc kubenswrapper[4725]: I0225 12:03:09.902713 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6df8d5688f-fkmbb_09976716-81ab-4d43-8250-fe3812bc8029/barbican-worker-log/0.log" Feb 25 12:03:09 crc kubenswrapper[4725]: I0225 12:03:09.959295 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s8lfl_a1b2db62-0e44-475c-bd55-aeceb2068aed/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.090659 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/ceilometer-central-agent/1.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.181286 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/ceilometer-notification-agent/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.187248 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/ceilometer-central-agent/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.211447 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/proxy-httpd/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.241288 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2b92e78-7b23-469e-9220-9ea38d9cba32/sg-core/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.424703 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca608800-07d2-4b62-8ac2-e544a667d664/cinder-api-log/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.448817 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ca608800-07d2-4b62-8ac2-e544a667d664/cinder-api/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.646667 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5a023b0b-cd51-47db-9fdf-74c673713272/cinder-scheduler/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.650136 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5a023b0b-cd51-47db-9fdf-74c673713272/probe/0.log" Feb 25 12:03:10 crc kubenswrapper[4725]: I0225 12:03:10.712978 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kcdcb_d3ef192a-3ad7-445f-b029-580b9e395372/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.174282 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pwpj5_3118e370-4c72-4fc4-bf2b-d27645473666/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.216290 
4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-hrfcv_f0789964-49e9-49e9-a6f5-133761c0d9f8/init/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.406658 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-hrfcv_f0789964-49e9-49e9-a6f5-133761c0d9f8/init/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.441271 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-hrfcv_f0789964-49e9-49e9-a6f5-133761c0d9f8/dnsmasq-dns/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.452565 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lzgm6_5bbf0497-1315-4613-b6ff-c826f5cf2a75/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.555709 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.555769 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.632948 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e/glance-httpd/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.650050 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_c0e3ea4a-8acb-4eee-a051-82ef6d7dad0e/glance-log/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.823493 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_993eb2eb-155b-419e-85a7-c59a25492dda/glance-log/0.log" Feb 25 12:03:11 crc kubenswrapper[4725]: I0225 12:03:11.835202 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_993eb2eb-155b-419e-85a7-c59a25492dda/glance-httpd/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.004616 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7cbf649584-gsrdx_f017ec2d-5d1b-405c-b2f7-b3212e3696d7/horizon/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.219388 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-skp48_b848df94-cae6-4ec8-bade-58be45c1cb4e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.407119 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gznp5_b40ab19d-a233-4263-b29f-390b5069752d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.438978 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7cbf649584-gsrdx_f017ec2d-5d1b-405c-b2f7-b3212e3696d7/horizon-log/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.663625 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29533681-fkv8s_bc1ee72a-eece-401e-9998-5570f2d5db12/keystone-cron/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.690930 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-7dcb568bf7-chvcs_8145d393-0967-4acc-bd07-befcc3252202/keystone-api/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.835201 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c0e72df9-3fcc-4373-b1af-fac9d1bc5e99/kube-state-metrics/0.log" Feb 25 12:03:12 crc kubenswrapper[4725]: I0225 12:03:12.907600 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fxn8n_6c225171-2b3a-414b-94d4-d73cc4d28b97/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:13 crc kubenswrapper[4725]: I0225 12:03:13.233375 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58868cbfd5-pvwdv_4971206d-e6f2-4355-8c47-9a7c9e1e51d6/neutron-api/0.log" Feb 25 12:03:13 crc kubenswrapper[4725]: I0225 12:03:13.294966 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58868cbfd5-pvwdv_4971206d-e6f2-4355-8c47-9a7c9e1e51d6/neutron-httpd/0.log" Feb 25 12:03:13 crc kubenswrapper[4725]: I0225 12:03:13.322355 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nhjk9_9479ee63-ae8c-4dfb-87f0-d92785a85f3b/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:13 crc kubenswrapper[4725]: I0225 12:03:13.966884 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0e95c876-3305-4b1d-9062-dffe7e184ffd/nova-api-log/0.log" Feb 25 12:03:13 crc kubenswrapper[4725]: I0225 12:03:13.997068 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_96799ef3-bd2e-4b3a-bc08-6c0b66dc46c6/nova-cell0-conductor-conductor/0.log" Feb 25 12:03:14 crc kubenswrapper[4725]: I0225 12:03:14.226193 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_1e17e12f-d899-470f-8087-b92c47f46c5b/nova-cell1-conductor-conductor/0.log" Feb 25 12:03:14 crc kubenswrapper[4725]: I0225 12:03:14.376777 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_53df7811-b191-4c54-b2c4-5faed23e2cc3/nova-cell1-novncproxy-novncproxy/0.log" Feb 25 12:03:14 crc kubenswrapper[4725]: I0225 12:03:14.406111 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0e95c876-3305-4b1d-9062-dffe7e184ffd/nova-api-api/0.log" Feb 25 12:03:14 crc kubenswrapper[4725]: I0225 12:03:14.464226 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-n8lt7_4c1ac37f-ee50-4446-8433-5c3f1c427205/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:14 crc kubenswrapper[4725]: I0225 12:03:14.706295 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_670a8e0c-fb4b-4311-b236-41a3f10c1ad2/nova-metadata-log/0.log" Feb 25 12:03:14 crc kubenswrapper[4725]: I0225 12:03:14.987687 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_99ef16ee-b18a-4374-9b14-0d6e08df5558/mysql-bootstrap/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.137157 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fe5c0a24-642c-4173-9b00-3d5a327f669e/nova-scheduler-scheduler/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.153452 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_99ef16ee-b18a-4374-9b14-0d6e08df5558/mysql-bootstrap/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.229795 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_99ef16ee-b18a-4374-9b14-0d6e08df5558/galera/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.387081 
4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6c23a18-36cf-4d71-885d-f2b93ba16375/mysql-bootstrap/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.548015 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6c23a18-36cf-4d71-885d-f2b93ba16375/mysql-bootstrap/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.598453 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6c23a18-36cf-4d71-885d-f2b93ba16375/galera/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.753282 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_71cbcb8e-872e-48b4-93a9-f5ee2edb3746/openstackclient/0.log" Feb 25 12:03:15 crc kubenswrapper[4725]: I0225 12:03:15.831435 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hlw77_82a07d0a-26d5-463c-95aa-eb022c49ac9d/openstack-network-exporter/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.076608 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_670a8e0c-fb4b-4311-b236-41a3f10c1ad2/nova-metadata-metadata/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.246966 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovsdb-server-init/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.454500 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovsdb-server/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.463618 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovs-vswitchd/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.469058 4725 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-drphb_493d04a9-b969-4c11-bd84-a1e9d57b7772/ovsdb-server-init/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.663606 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xpvnr_d2445fb4-75ca-4ea2-b979-5757105279ab/ovn-controller/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.773918 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rgwzr_65453adf-918b-40e1-bce0-4d4cb4ab7f56/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.874351 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_254996fe-9d34-46de-8e63-d4762c639a24/openstack-network-exporter/0.log" Feb 25 12:03:16 crc kubenswrapper[4725]: I0225 12:03:16.971933 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_254996fe-9d34-46de-8e63-d4762c639a24/ovn-northd/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.056697 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_24e787b7-ef1d-4c61-b01a-f8119d7911c0/openstack-network-exporter/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.105783 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_24e787b7-ef1d-4c61-b01a-f8119d7911c0/ovsdbserver-nb/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.278547 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_818d1929-2446-4ce6-80ec-6ed3fdec2b3d/openstack-network-exporter/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.302862 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_818d1929-2446-4ce6-80ec-6ed3fdec2b3d/ovsdbserver-sb/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.560204 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-69c7668f4d-s7tf6_502da0ce-a7f4-4af1-87a8-f9a7bb197b39/placement-log/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.572451 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69c7668f4d-s7tf6_502da0ce-a7f4-4af1-87a8-f9a7bb197b39/placement-api/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.587694 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5bb7295b-193b-45b6-8913-8508d190e664/setup-container/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.790187 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5bb7295b-193b-45b6-8913-8508d190e664/setup-container/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.877971 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5bb7295b-193b-45b6-8913-8508d190e664/rabbitmq/0.log" Feb 25 12:03:17 crc kubenswrapper[4725]: I0225 12:03:17.897468 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8cd71ea0-569c-4093-931d-2e0c841bcbf4/setup-container/0.log" Feb 25 12:03:18 crc kubenswrapper[4725]: I0225 12:03:18.094645 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8cd71ea0-569c-4093-931d-2e0c841bcbf4/rabbitmq/0.log" Feb 25 12:03:18 crc kubenswrapper[4725]: I0225 12:03:18.100921 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8cd71ea0-569c-4093-931d-2e0c841bcbf4/setup-container/0.log" Feb 25 12:03:18 crc kubenswrapper[4725]: I0225 12:03:18.177330 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-cbgnq_2b6f1103-ca9d-4e09-9816-83e1751a56ff/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:18 crc kubenswrapper[4725]: I0225 
12:03:18.292433 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-mlkj8_a8206236-adf4-4501-bbc7-6333709aa101/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:18 crc kubenswrapper[4725]: I0225 12:03:18.561452 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jrhdw_c034211a-1e4c-4636-9f07-a8c4b89bed34/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.253449 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gx8f9_6583b5b3-bae7-4cbc-a3ce-568b1c7e5bd8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.374593 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rfwzw_f5f7958b-17b2-40ba-a17b-bc8eefa6d59d/ssh-known-hosts-edpm-deployment/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.571275 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-zc6sk_c5574881-8546-456a-96b2-d58158e8a447/swift-ring-rebalance/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.581239 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d86f859c9-f94qp_cdb91fb4-91c1-4761-8724-24a845ee9d03/proxy-server/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.616533 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d86f859c9-f94qp_cdb91fb4-91c1-4761-8724-24a845ee9d03/proxy-httpd/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.768730 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-auditor/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.838443 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-reaper/0.log" Feb 25 12:03:19 crc kubenswrapper[4725]: I0225 12:03:19.846555 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-replicator/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.006132 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/account-server/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.028661 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-server/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.036351 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-auditor/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.064652 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-replicator/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.200382 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/container-updater/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.249144 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-auditor/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.293940 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-replicator/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.308003 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-expirer/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.405701 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-server/0.log" Feb 25 12:03:20 crc kubenswrapper[4725]: I0225 12:03:20.923052 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/object-updater/0.log" Feb 25 12:03:21 crc kubenswrapper[4725]: I0225 12:03:21.080454 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/rsync/0.log" Feb 25 12:03:21 crc kubenswrapper[4725]: I0225 12:03:21.122669 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d922deba-d455-45a7-ade3-dc2f588617bc/swift-recon-cron/0.log" Feb 25 12:03:21 crc kubenswrapper[4725]: I0225 12:03:21.245578 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dg75m_07f2c78c-f46d-4751-ae1b-ac502a378ff4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:21 crc kubenswrapper[4725]: I0225 12:03:21.315664 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_07081f50-997d-4877-be58-a446955dfe62/tempest-tests-tempest-tests-runner/0.log" Feb 25 12:03:21 crc kubenswrapper[4725]: I0225 12:03:21.483069 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6d6cd6ff-f8a8-4cab-b786-90440d19dbf1/test-operator-logs-container/0.log" Feb 25 12:03:21 crc kubenswrapper[4725]: I0225 12:03:21.588905 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-p46vh_6d50a11f-90c8-490f-90a3-9fb2c14f2bea/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 12:03:30 crc kubenswrapper[4725]: I0225 12:03:30.801403 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a30e3088-499a-491e-a9b0-65e54ac709c9/memcached/0.log" Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.556041 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.556629 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.556681 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.557565 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b036620e875f4a758dd804181c8957fd14a1029d422786a3424f55fe7e40b96c"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.557645 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://b036620e875f4a758dd804181c8957fd14a1029d422786a3424f55fe7e40b96c" gracePeriod=600 Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.707237 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="b036620e875f4a758dd804181c8957fd14a1029d422786a3424f55fe7e40b96c" exitCode=0 Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.707279 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"b036620e875f4a758dd804181c8957fd14a1029d422786a3424f55fe7e40b96c"} Feb 25 12:03:41 crc kubenswrapper[4725]: I0225 12:03:41.707311 4725 scope.go:117] "RemoveContainer" containerID="4cc992bd547e14ab0017b3cc4957ea7620548f5adef615c548e1b9c13b50ed0c" Feb 25 12:03:42 crc kubenswrapper[4725]: I0225 12:03:42.720222 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"} Feb 25 12:03:48 crc kubenswrapper[4725]: I0225 12:03:48.918382 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-gm94c_a897851d-6b6d-40e1-82f2-ef4db97b19d9/manager/0.log" Feb 25 12:03:49 crc kubenswrapper[4725]: I0225 12:03:49.180590 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/util/0.log" Feb 25 12:03:49 crc kubenswrapper[4725]: I0225 12:03:49.372156 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/util/0.log" Feb 25 12:03:49 crc kubenswrapper[4725]: I0225 12:03:49.404002 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/pull/0.log" Feb 25 12:03:49 crc kubenswrapper[4725]: I0225 12:03:49.634810 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/pull/0.log" Feb 25 12:03:49 crc kubenswrapper[4725]: I0225 12:03:49.780907 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/util/0.log" Feb 25 12:03:49 crc kubenswrapper[4725]: I0225 12:03:49.894245 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/pull/0.log" Feb 25 12:03:49 crc kubenswrapper[4725]: I0225 12:03:49.997012 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f06060f0c970ce8ad3cc61c762fa6b74efe155eb51d096e6bcd2c302019d68k_f00c3456-1352-4fa0-90e7-44648edcf473/extract/0.log" Feb 25 12:03:50 crc kubenswrapper[4725]: I0225 12:03:50.346105 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-97g26_22854bfa-3684-4750-b2f7-e5ccbe3e92fb/manager/0.log" Feb 25 12:03:50 crc kubenswrapper[4725]: I0225 12:03:50.560315 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-wj5dw_0755d178-0ceb-41f1-a26c-e96e466f8300/manager/0.log" Feb 25 12:03:50 crc kubenswrapper[4725]: 
I0225 12:03:50.933204 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-j4hbq_6cf86133-a9ef-4a8b-a957-ef8e588b200e/manager/0.log" Feb 25 12:03:50 crc kubenswrapper[4725]: I0225 12:03:50.941732 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-65rfv_27540507-aac9-4fd2-84a9-34a2a20885d7/manager/0.log" Feb 25 12:03:51 crc kubenswrapper[4725]: I0225 12:03:51.255211 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-h2tmg_015fdc09-2359-48f1-9800-9d44efc254fc/manager/0.log" Feb 25 12:03:51 crc kubenswrapper[4725]: I0225 12:03:51.588417 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-6872z_b82c26d2-a08f-4c57-a876-9ac8a87c1fcf/manager/0.log" Feb 25 12:03:51 crc kubenswrapper[4725]: I0225 12:03:51.634197 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-2vhq7_9a7b2bf7-fab5-4634-9dfa-147dc2de21bc/manager/0.log" Feb 25 12:03:51 crc kubenswrapper[4725]: I0225 12:03:51.821738 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-25sql_0279e1a1-c275-48e8-815c-0afae718b93a/manager/0.log" Feb 25 12:03:51 crc kubenswrapper[4725]: I0225 12:03:51.828968 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-6s7s5_5b458e63-ce2e-4d37-9509-5b31170d932f/manager/0.log" Feb 25 12:03:52 crc kubenswrapper[4725]: I0225 12:03:52.059023 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-pxnr7_c07a7a9d-d976-4d10-af1d-b92b5da76d71/manager/0.log" Feb 25 12:03:52 crc 
kubenswrapper[4725]: I0225 12:03:52.213482 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-kn6fp_ba6741a0-f2ce-464b-aaa4-eafa6f4f0eb6/manager/0.log" Feb 25 12:03:52 crc kubenswrapper[4725]: I0225 12:03:52.269062 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-8fthg_37d48839-36c8-4a2c-ac3d-a4e5394b11eb/manager/0.log" Feb 25 12:03:52 crc kubenswrapper[4725]: I0225 12:03:52.439383 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c8skvd_2fbb069d-66ce-4d87-9fcb-f82181bd85e9/manager/0.log" Feb 25 12:03:52 crc kubenswrapper[4725]: I0225 12:03:52.699092 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-74c9788cdf-zqhdj_267ba587-2b6d-4cfa-9e0b-2b8fce4d5bfe/operator/0.log" Feb 25 12:03:52 crc kubenswrapper[4725]: I0225 12:03:52.822645 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sn5wb_52de4181-d70f-4961-abfe-957862ec7ed0/registry-server/0.log" Feb 25 12:03:53 crc kubenswrapper[4725]: I0225 12:03:53.073271 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-lgqlc_07870810-90ed-47a5-90f5-b684700f7092/manager/0.log" Feb 25 12:03:53 crc kubenswrapper[4725]: I0225 12:03:53.144664 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-v8c26_4b18c8c4-1868-4383-b2d7-d9b3c9a33e03/manager/0.log" Feb 25 12:03:53 crc kubenswrapper[4725]: I0225 12:03:53.320289 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-bzx24_9921b017-bf1b-457d-b9ec-b344b0fabd1c/operator/0.log" Feb 25 
12:03:53 crc kubenswrapper[4725]: I0225 12:03:53.506439 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-mvqqg_01823ef1-1bcc-49f8-8cbc-37db7edc9fd0/manager/0.log" Feb 25 12:03:53 crc kubenswrapper[4725]: I0225 12:03:53.722356 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-8gchs_2b257035-93ff-456f-8aaa-e370a1756b0e/manager/0.log" Feb 25 12:03:53 crc kubenswrapper[4725]: I0225 12:03:53.739484 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-t2ncn_cf5974e9-29dc-4274-8f65-9cf82450bdfc/manager/0.log" Feb 25 12:03:53 crc kubenswrapper[4725]: I0225 12:03:53.970308 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-6lfbp_e1b06e72-2952-4eee-9732-af05abc6a117/manager/0.log" Feb 25 12:03:54 crc kubenswrapper[4725]: I0225 12:03:54.440207 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7489bcf59c-kb5pq_b6b802f9-7adb-43ca-b8ae-de7bacb908fb/manager/0.log" Feb 25 12:03:58 crc kubenswrapper[4725]: I0225 12:03:58.573018 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-l278b_41775582-fd78-4c34-93fc-60b9cdc55a2c/manager/0.log" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.167352 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533684-8k7pj"] Feb 25 12:04:00 crc kubenswrapper[4725]: E0225 12:04:00.168577 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d79abb-8de6-4166-812b-00b659f308cc" containerName="container-00" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.168597 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="56d79abb-8de6-4166-812b-00b659f308cc" containerName="container-00" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.168955 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d79abb-8de6-4166-812b-00b659f308cc" containerName="container-00" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.169960 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533684-8k7pj" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.171806 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.171925 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.172981 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.202632 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533684-8k7pj"] Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.248964 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z765\" (UniqueName: \"kubernetes.io/projected/2cdd6fbf-7184-4d92-a775-42d116b9491a-kube-api-access-7z765\") pod \"auto-csr-approver-29533684-8k7pj\" (UID: \"2cdd6fbf-7184-4d92-a775-42d116b9491a\") " pod="openshift-infra/auto-csr-approver-29533684-8k7pj" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.351900 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z765\" (UniqueName: \"kubernetes.io/projected/2cdd6fbf-7184-4d92-a775-42d116b9491a-kube-api-access-7z765\") pod \"auto-csr-approver-29533684-8k7pj\" (UID: \"2cdd6fbf-7184-4d92-a775-42d116b9491a\") " 
pod="openshift-infra/auto-csr-approver-29533684-8k7pj" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.378457 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z765\" (UniqueName: \"kubernetes.io/projected/2cdd6fbf-7184-4d92-a775-42d116b9491a-kube-api-access-7z765\") pod \"auto-csr-approver-29533684-8k7pj\" (UID: \"2cdd6fbf-7184-4d92-a775-42d116b9491a\") " pod="openshift-infra/auto-csr-approver-29533684-8k7pj" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.491783 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533684-8k7pj" Feb 25 12:04:00 crc kubenswrapper[4725]: I0225 12:04:00.989362 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533684-8k7pj"] Feb 25 12:04:00 crc kubenswrapper[4725]: W0225 12:04:00.993992 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cdd6fbf_7184_4d92_a775_42d116b9491a.slice/crio-715d56fbbfc95b7e32ec00e3d43cc4ec4b258b8bfa952a44f15b304e993fa0de WatchSource:0}: Error finding container 715d56fbbfc95b7e32ec00e3d43cc4ec4b258b8bfa952a44f15b304e993fa0de: Status 404 returned error can't find the container with id 715d56fbbfc95b7e32ec00e3d43cc4ec4b258b8bfa952a44f15b304e993fa0de Feb 25 12:04:01 crc kubenswrapper[4725]: I0225 12:04:01.866163 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533684-8k7pj" event={"ID":"2cdd6fbf-7184-4d92-a775-42d116b9491a","Type":"ContainerStarted","Data":"715d56fbbfc95b7e32ec00e3d43cc4ec4b258b8bfa952a44f15b304e993fa0de"} Feb 25 12:04:02 crc kubenswrapper[4725]: I0225 12:04:02.875909 4725 generic.go:334] "Generic (PLEG): container finished" podID="2cdd6fbf-7184-4d92-a775-42d116b9491a" containerID="07a354ad90a6a6a96002772adaed084e914c2e7826509075f6b4becd39b3f7c8" exitCode=0 Feb 25 12:04:02 crc kubenswrapper[4725]: 
I0225 12:04:02.876003 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533684-8k7pj" event={"ID":"2cdd6fbf-7184-4d92-a775-42d116b9491a","Type":"ContainerDied","Data":"07a354ad90a6a6a96002772adaed084e914c2e7826509075f6b4becd39b3f7c8"} Feb 25 12:04:04 crc kubenswrapper[4725]: I0225 12:04:04.319668 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533684-8k7pj" Feb 25 12:04:04 crc kubenswrapper[4725]: I0225 12:04:04.425422 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z765\" (UniqueName: \"kubernetes.io/projected/2cdd6fbf-7184-4d92-a775-42d116b9491a-kube-api-access-7z765\") pod \"2cdd6fbf-7184-4d92-a775-42d116b9491a\" (UID: \"2cdd6fbf-7184-4d92-a775-42d116b9491a\") " Feb 25 12:04:04 crc kubenswrapper[4725]: I0225 12:04:04.440120 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdd6fbf-7184-4d92-a775-42d116b9491a-kube-api-access-7z765" (OuterVolumeSpecName: "kube-api-access-7z765") pod "2cdd6fbf-7184-4d92-a775-42d116b9491a" (UID: "2cdd6fbf-7184-4d92-a775-42d116b9491a"). InnerVolumeSpecName "kube-api-access-7z765". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:04:04 crc kubenswrapper[4725]: I0225 12:04:04.527399 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z765\" (UniqueName: \"kubernetes.io/projected/2cdd6fbf-7184-4d92-a775-42d116b9491a-kube-api-access-7z765\") on node \"crc\" DevicePath \"\"" Feb 25 12:04:04 crc kubenswrapper[4725]: I0225 12:04:04.894619 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533684-8k7pj" event={"ID":"2cdd6fbf-7184-4d92-a775-42d116b9491a","Type":"ContainerDied","Data":"715d56fbbfc95b7e32ec00e3d43cc4ec4b258b8bfa952a44f15b304e993fa0de"} Feb 25 12:04:04 crc kubenswrapper[4725]: I0225 12:04:04.894679 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="715d56fbbfc95b7e32ec00e3d43cc4ec4b258b8bfa952a44f15b304e993fa0de" Feb 25 12:04:04 crc kubenswrapper[4725]: I0225 12:04:04.894652 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533684-8k7pj" Feb 25 12:04:05 crc kubenswrapper[4725]: I0225 12:04:05.390555 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533678-nck22"] Feb 25 12:04:05 crc kubenswrapper[4725]: I0225 12:04:05.401007 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533678-nck22"] Feb 25 12:04:07 crc kubenswrapper[4725]: I0225 12:04:07.235766 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aec2150-ee04-432f-b0d3-a8a91d1eca11" path="/var/lib/kubelet/pods/6aec2150-ee04-432f-b0d3-a8a91d1eca11/volumes" Feb 25 12:04:14 crc kubenswrapper[4725]: I0225 12:04:14.678717 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gbzbf_93efef4f-c6c1-47b8-ba83-12c56c3b08ea/control-plane-machine-set-operator/0.log" Feb 25 12:04:14 crc kubenswrapper[4725]: 
I0225 12:04:14.869486 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mw7b2_58ea6113-66d2-421d-b7cd-723463055f04/kube-rbac-proxy/0.log" Feb 25 12:04:14 crc kubenswrapper[4725]: I0225 12:04:14.903869 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mw7b2_58ea6113-66d2-421d-b7cd-723463055f04/machine-api-operator/0.log" Feb 25 12:04:22 crc kubenswrapper[4725]: I0225 12:04:22.997018 4725 scope.go:117] "RemoveContainer" containerID="69ffe4928c4b6e8ae537c96d58a3ab872147a72b0aa87a6e3ec8fc52632da7c4" Feb 25 12:04:28 crc kubenswrapper[4725]: I0225 12:04:28.201582 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-k9jtf_32a638a3-425e-4564-b5e1-b11c3d332ed6/cert-manager-controller/0.log" Feb 25 12:04:28 crc kubenswrapper[4725]: I0225 12:04:28.316519 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6xsxb_5f1f7118-2524-4653-9a60-82142d16ef44/cert-manager-cainjector/0.log" Feb 25 12:04:28 crc kubenswrapper[4725]: I0225 12:04:28.398517 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-6wnjw_899495ee-adcf-4350-a1b3-6a3cdd8c9d42/cert-manager-webhook/0.log" Feb 25 12:04:40 crc kubenswrapper[4725]: I0225 12:04:40.761509 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-mk4rx_0b1364b5-8725-454d-962e-a8c86ca27c2b/nmstate-console-plugin/0.log" Feb 25 12:04:40 crc kubenswrapper[4725]: I0225 12:04:40.914892 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-4xp96_82055e0c-d941-42a4-a029-e58f3893b303/nmstate-handler/0.log" Feb 25 12:04:40 crc kubenswrapper[4725]: I0225 12:04:40.998937 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-w2z2q_e0523051-56ca-4df3-ae89-488db2c9c37a/kube-rbac-proxy/0.log" Feb 25 12:04:41 crc kubenswrapper[4725]: I0225 12:04:41.055331 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-w2z2q_e0523051-56ca-4df3-ae89-488db2c9c37a/nmstate-metrics/0.log" Feb 25 12:04:41 crc kubenswrapper[4725]: I0225 12:04:41.175788 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rrwts_7d7a9448-ae03-426a-8c08-5823c6097b8c/nmstate-operator/0.log" Feb 25 12:04:41 crc kubenswrapper[4725]: I0225 12:04:41.222932 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-sgrvv_93f80fd4-e221-4c81-ab48-77beb578add9/nmstate-webhook/0.log" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.496330 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4kf5l"] Feb 25 12:05:07 crc kubenswrapper[4725]: E0225 12:05:07.497287 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdd6fbf-7184-4d92-a775-42d116b9491a" containerName="oc" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.497300 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdd6fbf-7184-4d92-a775-42d116b9491a" containerName="oc" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.497511 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdd6fbf-7184-4d92-a775-42d116b9491a" containerName="oc" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.499112 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.516023 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kf5l"] Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.566877 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-catalog-content\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.566944 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-utilities\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.566966 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvr4h\" (UniqueName: \"kubernetes.io/projected/a0620879-9fd0-4e14-ba25-e2624fadcd53-kube-api-access-dvr4h\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.669091 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-utilities\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.669391 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dvr4h\" (UniqueName: \"kubernetes.io/projected/a0620879-9fd0-4e14-ba25-e2624fadcd53-kube-api-access-dvr4h\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.669551 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-catalog-content\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.669893 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-utilities\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.669967 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-catalog-content\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.692753 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvr4h\" (UniqueName: \"kubernetes.io/projected/a0620879-9fd0-4e14-ba25-e2624fadcd53-kube-api-access-dvr4h\") pod \"redhat-marketplace-4kf5l\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:07 crc kubenswrapper[4725]: I0225 12:05:07.819565 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.345944 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kf5l"] Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.426179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kf5l" event={"ID":"a0620879-9fd0-4e14-ba25-e2624fadcd53","Type":"ContainerStarted","Data":"4251bfe92444022ea2101542be2fa38403bd735a21d75acbced37a1224e7078a"} Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.467312 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-67xsd_beb65949-bc67-4f16-892c-8979cc412e9e/kube-rbac-proxy/0.log" Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.485068 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-67xsd_beb65949-bc67-4f16-892c-8979cc412e9e/controller/0.log" Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.647533 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.825031 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.837784 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.857205 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 12:05:08 crc kubenswrapper[4725]: I0225 12:05:08.893410 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.033699 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.080459 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.091324 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.104455 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.243993 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-frr-files/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.292089 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-metrics/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.315782 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/cp-reloader/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.319073 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/controller/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.463587 4725 generic.go:334] "Generic (PLEG): container finished" podID="a0620879-9fd0-4e14-ba25-e2624fadcd53" 
containerID="b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d" exitCode=0 Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.463627 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kf5l" event={"ID":"a0620879-9fd0-4e14-ba25-e2624fadcd53","Type":"ContainerDied","Data":"b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d"} Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.481861 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/frr-metrics/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.529066 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/kube-rbac-proxy-frr/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.541007 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/kube-rbac-proxy/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.685348 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/reloader/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.757105 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-xn4fl_bcc5161a-6f59-4878-a2ae-5f4a533021c3/frr-k8s-webhook-server/0.log" Feb 25 12:05:09 crc kubenswrapper[4725]: I0225 12:05:09.957095 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-768ffd8bd5-q5ktr_14923832-70ad-4019-b795-4094d767dfda/manager/0.log" Feb 25 12:05:10 crc kubenswrapper[4725]: I0225 12:05:10.117725 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56448fcbcf-jpqnm_50bf643b-abcd-4134-bfd5-a08256ad5652/webhook-server/0.log" Feb 25 12:05:10 crc kubenswrapper[4725]: I0225 12:05:10.225846 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-svwnh_b577777e-718a-4f09-a76a-98aa4f068184/kube-rbac-proxy/0.log" Feb 25 12:05:10 crc kubenswrapper[4725]: I0225 12:05:10.792679 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-svwnh_b577777e-718a-4f09-a76a-98aa4f068184/speaker/0.log" Feb 25 12:05:11 crc kubenswrapper[4725]: I0225 12:05:11.024084 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjqx6_0ced2390-9bb3-44f1-a851-994322d83bff/frr/0.log" Feb 25 12:05:11 crc kubenswrapper[4725]: I0225 12:05:11.487007 4725 generic.go:334] "Generic (PLEG): container finished" podID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerID="4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716" exitCode=0 Feb 25 12:05:11 crc kubenswrapper[4725]: I0225 12:05:11.487076 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kf5l" event={"ID":"a0620879-9fd0-4e14-ba25-e2624fadcd53","Type":"ContainerDied","Data":"4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716"} Feb 25 12:05:13 crc kubenswrapper[4725]: I0225 12:05:13.514937 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kf5l" event={"ID":"a0620879-9fd0-4e14-ba25-e2624fadcd53","Type":"ContainerStarted","Data":"5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628"} Feb 25 12:05:13 crc kubenswrapper[4725]: I0225 12:05:13.537468 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4kf5l" podStartSLOduration=3.047846948 podStartE2EDuration="6.537446941s" podCreationTimestamp="2026-02-25 12:05:07 +0000 UTC" 
firstStartedPulling="2026-02-25 12:05:09.467997559 +0000 UTC m=+4334.966579584" lastFinishedPulling="2026-02-25 12:05:12.957597532 +0000 UTC m=+4338.456179577" observedRunningTime="2026-02-25 12:05:13.532937652 +0000 UTC m=+4339.031519717" watchObservedRunningTime="2026-02-25 12:05:13.537446941 +0000 UTC m=+4339.036028976" Feb 25 12:05:17 crc kubenswrapper[4725]: I0225 12:05:17.819747 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:17 crc kubenswrapper[4725]: I0225 12:05:17.820357 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:17 crc kubenswrapper[4725]: I0225 12:05:17.878318 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:18 crc kubenswrapper[4725]: I0225 12:05:18.607684 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:18 crc kubenswrapper[4725]: I0225 12:05:18.661808 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kf5l"] Feb 25 12:05:20 crc kubenswrapper[4725]: I0225 12:05:20.578640 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4kf5l" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="registry-server" containerID="cri-o://5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628" gracePeriod=2 Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.322384 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.407416 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvr4h\" (UniqueName: \"kubernetes.io/projected/a0620879-9fd0-4e14-ba25-e2624fadcd53-kube-api-access-dvr4h\") pod \"a0620879-9fd0-4e14-ba25-e2624fadcd53\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.407455 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-utilities\") pod \"a0620879-9fd0-4e14-ba25-e2624fadcd53\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.407486 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-catalog-content\") pod \"a0620879-9fd0-4e14-ba25-e2624fadcd53\" (UID: \"a0620879-9fd0-4e14-ba25-e2624fadcd53\") " Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.408871 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-utilities" (OuterVolumeSpecName: "utilities") pod "a0620879-9fd0-4e14-ba25-e2624fadcd53" (UID: "a0620879-9fd0-4e14-ba25-e2624fadcd53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.415617 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0620879-9fd0-4e14-ba25-e2624fadcd53-kube-api-access-dvr4h" (OuterVolumeSpecName: "kube-api-access-dvr4h") pod "a0620879-9fd0-4e14-ba25-e2624fadcd53" (UID: "a0620879-9fd0-4e14-ba25-e2624fadcd53"). InnerVolumeSpecName "kube-api-access-dvr4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.442098 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0620879-9fd0-4e14-ba25-e2624fadcd53" (UID: "a0620879-9fd0-4e14-ba25-e2624fadcd53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.510004 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvr4h\" (UniqueName: \"kubernetes.io/projected/a0620879-9fd0-4e14-ba25-e2624fadcd53-kube-api-access-dvr4h\") on node \"crc\" DevicePath \"\"" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.510042 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.510054 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0620879-9fd0-4e14-ba25-e2624fadcd53-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.587770 4725 generic.go:334] "Generic (PLEG): container finished" podID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerID="5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628" exitCode=0 Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.587807 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kf5l" event={"ID":"a0620879-9fd0-4e14-ba25-e2624fadcd53","Type":"ContainerDied","Data":"5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628"} Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.587879 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-4kf5l" event={"ID":"a0620879-9fd0-4e14-ba25-e2624fadcd53","Type":"ContainerDied","Data":"4251bfe92444022ea2101542be2fa38403bd735a21d75acbced37a1224e7078a"} Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.587921 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kf5l" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.587921 4725 scope.go:117] "RemoveContainer" containerID="5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.607354 4725 scope.go:117] "RemoveContainer" containerID="4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.627544 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kf5l"] Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.636997 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kf5l"] Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.648587 4725 scope.go:117] "RemoveContainer" containerID="b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.683164 4725 scope.go:117] "RemoveContainer" containerID="5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628" Feb 25 12:05:21 crc kubenswrapper[4725]: E0225 12:05:21.683683 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628\": container with ID starting with 5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628 not found: ID does not exist" containerID="5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.683739 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628"} err="failed to get container status \"5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628\": rpc error: code = NotFound desc = could not find container \"5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628\": container with ID starting with 5b4cf1e3b50b9e686a48d9ca5017d173614949044aaf276a77eb6a6a69423628 not found: ID does not exist" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.683773 4725 scope.go:117] "RemoveContainer" containerID="4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716" Feb 25 12:05:21 crc kubenswrapper[4725]: E0225 12:05:21.685891 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716\": container with ID starting with 4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716 not found: ID does not exist" containerID="4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.685919 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716"} err="failed to get container status \"4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716\": rpc error: code = NotFound desc = could not find container \"4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716\": container with ID starting with 4cd3fb8b61e07d1c6be4153542b9bfda686bb5a833a4eb94707d0f8d24045716 not found: ID does not exist" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.685935 4725 scope.go:117] "RemoveContainer" containerID="b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d" Feb 25 12:05:21 crc kubenswrapper[4725]: E0225 
12:05:21.686239 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d\": container with ID starting with b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d not found: ID does not exist" containerID="b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d" Feb 25 12:05:21 crc kubenswrapper[4725]: I0225 12:05:21.686280 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d"} err="failed to get container status \"b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d\": rpc error: code = NotFound desc = could not find container \"b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d\": container with ID starting with b6d47f263563f9bf7f27200290d8127c3da87051abb3b687ecae75b5e4d66c9d not found: ID does not exist" Feb 25 12:05:23 crc kubenswrapper[4725]: I0225 12:05:23.239308 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" path="/var/lib/kubelet/pods/a0620879-9fd0-4e14-ba25-e2624fadcd53/volumes" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.249788 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/util/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.415327 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/pull/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.431245 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/util/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.452699 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/pull/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.619196 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/pull/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.634018 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/util/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.634734 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213kp4xw_c5e3d2f9-7701-4ab5-a043-64fe366bc324/extract/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.762657 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-utilities/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.934697 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-utilities/0.log" Feb 25 12:05:24 crc kubenswrapper[4725]: I0225 12:05:24.946082 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-content/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.004768 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-content/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.109875 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-content/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.129446 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/extract-utilities/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.302854 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-utilities/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.596140 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-content/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.613006 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-utilities/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.640556 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-content/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.683034 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vqq6w_b58eda4b-360e-4504-a3be-a409e8225852/registry-server/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.884272 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-content/0.log" Feb 25 12:05:25 crc kubenswrapper[4725]: I0225 12:05:25.884494 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/extract-utilities/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.066372 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/util/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.334927 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/pull/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.344570 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/util/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.351395 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/pull/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.368070 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5mr86_12d4b15a-99ab-4671-bc50-6790e38d355c/registry-server/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.495463 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/util/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.527980 4725 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/pull/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.530503 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecal667d_6b4bc033-9181-40c7-8264-19b5a49c8e7f/extract/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.710885 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k82sj_d18563e5-7e1f-4e98-9419-d71fa34b9fd2/marketplace-operator/0.log" Feb 25 12:05:26 crc kubenswrapper[4725]: I0225 12:05:26.732715 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-utilities/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.078431 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-content/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.083863 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-utilities/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.118258 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-content/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.256977 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-content/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.323598 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/extract-utilities/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.388694 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g7f9h_15f43ce2-181a-480f-9ea5-c608d2d414c4/registry-server/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.475545 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-utilities/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.653686 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-content/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.656130 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-utilities/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.676397 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-content/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.896571 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-content/0.log" Feb 25 12:05:27 crc kubenswrapper[4725]: I0225 12:05:27.971700 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/extract-utilities/0.log" Feb 25 12:05:28 crc kubenswrapper[4725]: I0225 12:05:28.469347 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-99g6v_ece6c2fe-4eaa-4d6e-bb4a-2f229f45f57a/registry-server/0.log" Feb 25 
12:05:41 crc kubenswrapper[4725]: I0225 12:05:41.554958 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 12:05:41 crc kubenswrapper[4725]: I0225 12:05:41.555374 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.138566 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533686-l277g"] Feb 25 12:06:00 crc kubenswrapper[4725]: E0225 12:06:00.139413 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="registry-server" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.139429 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="registry-server" Feb 25 12:06:00 crc kubenswrapper[4725]: E0225 12:06:00.139455 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="extract-content" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.139463 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="extract-content" Feb 25 12:06:00 crc kubenswrapper[4725]: E0225 12:06:00.139478 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="extract-utilities" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.139486 4725 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="extract-utilities" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.139703 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0620879-9fd0-4e14-ba25-e2624fadcd53" containerName="registry-server" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.140508 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533686-l277g" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.143311 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.143472 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.145414 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.150926 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533686-l277g"] Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.254602 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t879d\" (UniqueName: \"kubernetes.io/projected/ce8c17bb-a8d3-4808-9dbb-d3e00020623d-kube-api-access-t879d\") pod \"auto-csr-approver-29533686-l277g\" (UID: \"ce8c17bb-a8d3-4808-9dbb-d3e00020623d\") " pod="openshift-infra/auto-csr-approver-29533686-l277g" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.357664 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t879d\" (UniqueName: \"kubernetes.io/projected/ce8c17bb-a8d3-4808-9dbb-d3e00020623d-kube-api-access-t879d\") pod \"auto-csr-approver-29533686-l277g\" (UID: 
\"ce8c17bb-a8d3-4808-9dbb-d3e00020623d\") " pod="openshift-infra/auto-csr-approver-29533686-l277g" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.386605 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t879d\" (UniqueName: \"kubernetes.io/projected/ce8c17bb-a8d3-4808-9dbb-d3e00020623d-kube-api-access-t879d\") pod \"auto-csr-approver-29533686-l277g\" (UID: \"ce8c17bb-a8d3-4808-9dbb-d3e00020623d\") " pod="openshift-infra/auto-csr-approver-29533686-l277g" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.463000 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533686-l277g" Feb 25 12:06:00 crc kubenswrapper[4725]: I0225 12:06:00.886974 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533686-l277g"] Feb 25 12:06:01 crc kubenswrapper[4725]: I0225 12:06:01.913059 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533686-l277g" event={"ID":"ce8c17bb-a8d3-4808-9dbb-d3e00020623d","Type":"ContainerStarted","Data":"800a3fe8984cbe87741a57e96845e9991b11740bf3cd20dd8bfeaa400789bcf9"} Feb 25 12:06:02 crc kubenswrapper[4725]: I0225 12:06:02.922058 4725 generic.go:334] "Generic (PLEG): container finished" podID="ce8c17bb-a8d3-4808-9dbb-d3e00020623d" containerID="ddce0d9a61c1210a33179fe0d45da524aa4f0511cd2932eeae8f6c6b93e3905c" exitCode=0 Feb 25 12:06:02 crc kubenswrapper[4725]: I0225 12:06:02.922144 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533686-l277g" event={"ID":"ce8c17bb-a8d3-4808-9dbb-d3e00020623d","Type":"ContainerDied","Data":"ddce0d9a61c1210a33179fe0d45da524aa4f0511cd2932eeae8f6c6b93e3905c"} Feb 25 12:06:04 crc kubenswrapper[4725]: I0225 12:06:04.291475 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533686-l277g" Feb 25 12:06:04 crc kubenswrapper[4725]: I0225 12:06:04.430708 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t879d\" (UniqueName: \"kubernetes.io/projected/ce8c17bb-a8d3-4808-9dbb-d3e00020623d-kube-api-access-t879d\") pod \"ce8c17bb-a8d3-4808-9dbb-d3e00020623d\" (UID: \"ce8c17bb-a8d3-4808-9dbb-d3e00020623d\") " Feb 25 12:06:04 crc kubenswrapper[4725]: I0225 12:06:04.444330 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8c17bb-a8d3-4808-9dbb-d3e00020623d-kube-api-access-t879d" (OuterVolumeSpecName: "kube-api-access-t879d") pod "ce8c17bb-a8d3-4808-9dbb-d3e00020623d" (UID: "ce8c17bb-a8d3-4808-9dbb-d3e00020623d"). InnerVolumeSpecName "kube-api-access-t879d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:06:04 crc kubenswrapper[4725]: I0225 12:06:04.533159 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t879d\" (UniqueName: \"kubernetes.io/projected/ce8c17bb-a8d3-4808-9dbb-d3e00020623d-kube-api-access-t879d\") on node \"crc\" DevicePath \"\"" Feb 25 12:06:04 crc kubenswrapper[4725]: I0225 12:06:04.940815 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533686-l277g" event={"ID":"ce8c17bb-a8d3-4808-9dbb-d3e00020623d","Type":"ContainerDied","Data":"800a3fe8984cbe87741a57e96845e9991b11740bf3cd20dd8bfeaa400789bcf9"} Feb 25 12:06:04 crc kubenswrapper[4725]: I0225 12:06:04.941226 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800a3fe8984cbe87741a57e96845e9991b11740bf3cd20dd8bfeaa400789bcf9" Feb 25 12:06:04 crc kubenswrapper[4725]: I0225 12:06:04.940908 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533686-l277g" Feb 25 12:06:05 crc kubenswrapper[4725]: I0225 12:06:05.399551 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533680-8pgr6"] Feb 25 12:06:05 crc kubenswrapper[4725]: I0225 12:06:05.417336 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533680-8pgr6"] Feb 25 12:06:07 crc kubenswrapper[4725]: I0225 12:06:07.244113 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ad2428-3759-4e3e-bbf5-07b4aab3365c" path="/var/lib/kubelet/pods/47ad2428-3759-4e3e-bbf5-07b4aab3365c/volumes" Feb 25 12:06:11 crc kubenswrapper[4725]: I0225 12:06:11.555728 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 12:06:11 crc kubenswrapper[4725]: I0225 12:06:11.558774 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 12:06:23 crc kubenswrapper[4725]: I0225 12:06:23.109699 4725 scope.go:117] "RemoveContainer" containerID="affa07606e90db95ab74b2d7772c4dbddc5867cdf03918fdc27db222bdd91e27" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.717789 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2rl2l"] Feb 25 12:06:37 crc kubenswrapper[4725]: E0225 12:06:37.719039 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8c17bb-a8d3-4808-9dbb-d3e00020623d" containerName="oc" Feb 25 12:06:37 crc 
kubenswrapper[4725]: I0225 12:06:37.719058 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8c17bb-a8d3-4808-9dbb-d3e00020623d" containerName="oc" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.719305 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8c17bb-a8d3-4808-9dbb-d3e00020623d" containerName="oc" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.721201 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.748952 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2rl2l"] Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.802534 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rklt4\" (UniqueName: \"kubernetes.io/projected/815ed8b6-8b30-4aa0-9ce3-5491586a2017-kube-api-access-rklt4\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.802606 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-catalog-content\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.802638 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-utilities\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc 
kubenswrapper[4725]: I0225 12:06:37.904371 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rklt4\" (UniqueName: \"kubernetes.io/projected/815ed8b6-8b30-4aa0-9ce3-5491586a2017-kube-api-access-rklt4\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.904474 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-catalog-content\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.904518 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-utilities\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.905019 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-utilities\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.905133 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-catalog-content\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:37 crc kubenswrapper[4725]: I0225 12:06:37.932040 
4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rklt4\" (UniqueName: \"kubernetes.io/projected/815ed8b6-8b30-4aa0-9ce3-5491586a2017-kube-api-access-rklt4\") pod \"certified-operators-2rl2l\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:38 crc kubenswrapper[4725]: I0225 12:06:38.046208 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:38 crc kubenswrapper[4725]: I0225 12:06:38.538055 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2rl2l"] Feb 25 12:06:39 crc kubenswrapper[4725]: I0225 12:06:39.270867 4725 generic.go:334] "Generic (PLEG): container finished" podID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerID="df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6" exitCode=0 Feb 25 12:06:39 crc kubenswrapper[4725]: I0225 12:06:39.270926 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rl2l" event={"ID":"815ed8b6-8b30-4aa0-9ce3-5491586a2017","Type":"ContainerDied","Data":"df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6"} Feb 25 12:06:39 crc kubenswrapper[4725]: I0225 12:06:39.271358 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rl2l" event={"ID":"815ed8b6-8b30-4aa0-9ce3-5491586a2017","Type":"ContainerStarted","Data":"c1dea8163c4a03e63f6d4535caa36b6642dcf55b38b8ae0f3f6788d5a2888f01"} Feb 25 12:06:40 crc kubenswrapper[4725]: I0225 12:06:40.282693 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rl2l" event={"ID":"815ed8b6-8b30-4aa0-9ce3-5491586a2017","Type":"ContainerStarted","Data":"f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d"} Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.304813 4725 
generic.go:334] "Generic (PLEG): container finished" podID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerID="f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d" exitCode=0 Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.305124 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rl2l" event={"ID":"815ed8b6-8b30-4aa0-9ce3-5491586a2017","Type":"ContainerDied","Data":"f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d"} Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.308592 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.555609 4725 patch_prober.go:28] interesting pod/machine-config-daemon-256sf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.555680 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.555729 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-256sf" Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.556453 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"} pod="openshift-machine-config-operator/machine-config-daemon-256sf" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 25 12:06:41 crc kubenswrapper[4725]: I0225 12:06:41.556531 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerName="machine-config-daemon" containerID="cri-o://b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" gracePeriod=600 Feb 25 12:06:41 crc kubenswrapper[4725]: E0225 12:06:41.790061 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:06:42 crc kubenswrapper[4725]: I0225 12:06:42.320755 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rl2l" event={"ID":"815ed8b6-8b30-4aa0-9ce3-5491586a2017","Type":"ContainerStarted","Data":"5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837"} Feb 25 12:06:42 crc kubenswrapper[4725]: I0225 12:06:42.324148 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4742f60-e555-4f96-be12-b9e46a857bd4" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" exitCode=0 Feb 25 12:06:42 crc kubenswrapper[4725]: I0225 12:06:42.324193 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerDied","Data":"b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"} Feb 25 12:06:42 crc kubenswrapper[4725]: I0225 12:06:42.324222 4725 scope.go:117] "RemoveContainer" 
containerID="b036620e875f4a758dd804181c8957fd14a1029d422786a3424f55fe7e40b96c" Feb 25 12:06:42 crc kubenswrapper[4725]: I0225 12:06:42.324808 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:06:42 crc kubenswrapper[4725]: E0225 12:06:42.325224 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:06:42 crc kubenswrapper[4725]: I0225 12:06:42.347494 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2rl2l" podStartSLOduration=2.696632511 podStartE2EDuration="5.347450923s" podCreationTimestamp="2026-02-25 12:06:37 +0000 UTC" firstStartedPulling="2026-02-25 12:06:39.272886026 +0000 UTC m=+4424.771468051" lastFinishedPulling="2026-02-25 12:06:41.923704438 +0000 UTC m=+4427.422286463" observedRunningTime="2026-02-25 12:06:42.337126469 +0000 UTC m=+4427.835708514" watchObservedRunningTime="2026-02-25 12:06:42.347450923 +0000 UTC m=+4427.846032948" Feb 25 12:06:48 crc kubenswrapper[4725]: I0225 12:06:48.046863 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:48 crc kubenswrapper[4725]: I0225 12:06:48.047421 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:48 crc kubenswrapper[4725]: I0225 12:06:48.118365 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:48 crc kubenswrapper[4725]: 
I0225 12:06:48.457471 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:48 crc kubenswrapper[4725]: I0225 12:06:48.519181 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2rl2l"] Feb 25 12:06:50 crc kubenswrapper[4725]: I0225 12:06:50.403975 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2rl2l" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="registry-server" containerID="cri-o://5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837" gracePeriod=2 Feb 25 12:06:50 crc kubenswrapper[4725]: I0225 12:06:50.896692 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2rl2l" Feb 25 12:06:50 crc kubenswrapper[4725]: I0225 12:06:50.985871 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-utilities\") pod \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " Feb 25 12:06:50 crc kubenswrapper[4725]: I0225 12:06:50.986165 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-catalog-content\") pod \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " Feb 25 12:06:50 crc kubenswrapper[4725]: I0225 12:06:50.986198 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rklt4\" (UniqueName: \"kubernetes.io/projected/815ed8b6-8b30-4aa0-9ce3-5491586a2017-kube-api-access-rklt4\") pod \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\" (UID: \"815ed8b6-8b30-4aa0-9ce3-5491586a2017\") " Feb 25 12:06:50 crc 
kubenswrapper[4725]: I0225 12:06:50.987029 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-utilities" (OuterVolumeSpecName: "utilities") pod "815ed8b6-8b30-4aa0-9ce3-5491586a2017" (UID: "815ed8b6-8b30-4aa0-9ce3-5491586a2017"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.005778 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815ed8b6-8b30-4aa0-9ce3-5491586a2017-kube-api-access-rklt4" (OuterVolumeSpecName: "kube-api-access-rklt4") pod "815ed8b6-8b30-4aa0-9ce3-5491586a2017" (UID: "815ed8b6-8b30-4aa0-9ce3-5491586a2017"). InnerVolumeSpecName "kube-api-access-rklt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.088671 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rklt4\" (UniqueName: \"kubernetes.io/projected/815ed8b6-8b30-4aa0-9ce3-5491586a2017-kube-api-access-rklt4\") on node \"crc\" DevicePath \"\"" Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.088709 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.419844 4725 generic.go:334] "Generic (PLEG): container finished" podID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerID="5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837" exitCode=0 Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.419903 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rl2l"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.419951 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rl2l" event={"ID":"815ed8b6-8b30-4aa0-9ce3-5491586a2017","Type":"ContainerDied","Data":"5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837"}
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.420053 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rl2l" event={"ID":"815ed8b6-8b30-4aa0-9ce3-5491586a2017","Type":"ContainerDied","Data":"c1dea8163c4a03e63f6d4535caa36b6642dcf55b38b8ae0f3f6788d5a2888f01"}
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.420089 4725 scope.go:117] "RemoveContainer" containerID="5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.448039 4725 scope.go:117] "RemoveContainer" containerID="f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.486175 4725 scope.go:117] "RemoveContainer" containerID="df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.546237 4725 scope.go:117] "RemoveContainer" containerID="5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837"
Feb 25 12:06:51 crc kubenswrapper[4725]: E0225 12:06:51.546654 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837\": container with ID starting with 5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837 not found: ID does not exist" containerID="5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.546684 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837"} err="failed to get container status \"5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837\": rpc error: code = NotFound desc = could not find container \"5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837\": container with ID starting with 5b49f02578cfb759aa1ad7b8895a66b3641fff33b06710c9788654169ae60837 not found: ID does not exist"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.546707 4725 scope.go:117] "RemoveContainer" containerID="f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d"
Feb 25 12:06:51 crc kubenswrapper[4725]: E0225 12:06:51.546966 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d\": container with ID starting with f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d not found: ID does not exist" containerID="f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.547028 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d"} err="failed to get container status \"f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d\": rpc error: code = NotFound desc = could not find container \"f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d\": container with ID starting with f708036666ab5d11b778cd9e0472428e5e5a0cf5c3260139739eb9080e684c9d not found: ID does not exist"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.547050 4725 scope.go:117] "RemoveContainer" containerID="df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6"
Feb 25 12:06:51 crc kubenswrapper[4725]: E0225 12:06:51.547241 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6\": container with ID starting with df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6 not found: ID does not exist" containerID="df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.547262 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6"} err="failed to get container status \"df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6\": rpc error: code = NotFound desc = could not find container \"df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6\": container with ID starting with df8611545e1b55d2c79927c7bb31804b82175c88e6ea4c9735e2c3e54ee8cfe6 not found: ID does not exist"
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.696246 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "815ed8b6-8b30-4aa0-9ce3-5491586a2017" (UID: "815ed8b6-8b30-4aa0-9ce3-5491586a2017"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.700007 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815ed8b6-8b30-4aa0-9ce3-5491586a2017-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.776236 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2rl2l"]
Feb 25 12:06:51 crc kubenswrapper[4725]: I0225 12:06:51.791484 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2rl2l"]
Feb 25 12:06:53 crc kubenswrapper[4725]: I0225 12:06:53.239119 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" path="/var/lib/kubelet/pods/815ed8b6-8b30-4aa0-9ce3-5491586a2017/volumes"
Feb 25 12:06:54 crc kubenswrapper[4725]: I0225 12:06:54.224992 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"
Feb 25 12:06:54 crc kubenswrapper[4725]: E0225 12:06:54.225487 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 12:07:08 crc kubenswrapper[4725]: I0225 12:07:08.224848 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"
Feb 25 12:07:08 crc kubenswrapper[4725]: E0225 12:07:08.226124 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 12:07:21 crc kubenswrapper[4725]: I0225 12:07:21.761552 4725 generic.go:334] "Generic (PLEG): container finished" podID="b395cf46-cc41-434e-be61-6104918005b0" containerID="3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c" exitCode=0
Feb 25 12:07:21 crc kubenswrapper[4725]: I0225 12:07:21.761695 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ghxvx/must-gather-4jxd6" event={"ID":"b395cf46-cc41-434e-be61-6104918005b0","Type":"ContainerDied","Data":"3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c"}
Feb 25 12:07:21 crc kubenswrapper[4725]: I0225 12:07:21.762624 4725 scope.go:117] "RemoveContainer" containerID="3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c"
Feb 25 12:07:22 crc kubenswrapper[4725]: I0225 12:07:22.077169 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ghxvx_must-gather-4jxd6_b395cf46-cc41-434e-be61-6104918005b0/gather/0.log"
Feb 25 12:07:22 crc kubenswrapper[4725]: I0225 12:07:22.224192 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"
Feb 25 12:07:22 crc kubenswrapper[4725]: E0225 12:07:22.224558 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.126339 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8mnt"]
Feb 25 12:07:27 crc kubenswrapper[4725]: E0225 12:07:27.127279 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="extract-content"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.127295 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="extract-content"
Feb 25 12:07:27 crc kubenswrapper[4725]: E0225 12:07:27.127319 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="registry-server"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.127328 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="registry-server"
Feb 25 12:07:27 crc kubenswrapper[4725]: E0225 12:07:27.127352 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="extract-utilities"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.127362 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="extract-utilities"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.127591 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="815ed8b6-8b30-4aa0-9ce3-5491586a2017" containerName="registry-server"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.130227 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.151443 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8mnt"]
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.335586 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b4l7\" (UniqueName: \"kubernetes.io/projected/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-kube-api-access-5b4l7\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.335640 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-catalog-content\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.336523 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-utilities\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.438862 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b4l7\" (UniqueName: \"kubernetes.io/projected/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-kube-api-access-5b4l7\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.439255 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-catalog-content\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.439543 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-utilities\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.439915 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-catalog-content\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.439969 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-utilities\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.457581 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b4l7\" (UniqueName: \"kubernetes.io/projected/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-kube-api-access-5b4l7\") pod \"redhat-operators-n8mnt\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") " pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:27 crc kubenswrapper[4725]: I0225 12:07:27.750156 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:28 crc kubenswrapper[4725]: I0225 12:07:28.225795 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8mnt"]
Feb 25 12:07:28 crc kubenswrapper[4725]: I0225 12:07:28.834472 4725 generic.go:334] "Generic (PLEG): container finished" podID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerID="06218cfe0a1c707717c79ed43b7745e04ffc6824e7726cf21488d6dbf004cd76" exitCode=0
Feb 25 12:07:28 crc kubenswrapper[4725]: I0225 12:07:28.834553 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8mnt" event={"ID":"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb","Type":"ContainerDied","Data":"06218cfe0a1c707717c79ed43b7745e04ffc6824e7726cf21488d6dbf004cd76"}
Feb 25 12:07:28 crc kubenswrapper[4725]: I0225 12:07:28.834912 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8mnt" event={"ID":"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb","Type":"ContainerStarted","Data":"c36c3d986a832e45089513d1d012ee93840c6a96d7e8f1ced06ea3d1b0a07689"}
Feb 25 12:07:29 crc kubenswrapper[4725]: I0225 12:07:29.847207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8mnt" event={"ID":"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb","Type":"ContainerStarted","Data":"6d0d538210a0668b44bb451833c631655cebcbc847e5b2579871079bcf4e455f"}
Feb 25 12:07:30 crc kubenswrapper[4725]: I0225 12:07:30.863637 4725 generic.go:334] "Generic (PLEG): container finished" podID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerID="6d0d538210a0668b44bb451833c631655cebcbc847e5b2579871079bcf4e455f" exitCode=0
Feb 25 12:07:30 crc kubenswrapper[4725]: I0225 12:07:30.863715 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8mnt" event={"ID":"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb","Type":"ContainerDied","Data":"6d0d538210a0668b44bb451833c631655cebcbc847e5b2579871079bcf4e455f"}
Feb 25 12:07:31 crc kubenswrapper[4725]: I0225 12:07:31.875218 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8mnt" event={"ID":"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb","Type":"ContainerStarted","Data":"637d6da238353c3e4ad6092991b48b24878ffe273d250b567c31769853156f39"}
Feb 25 12:07:31 crc kubenswrapper[4725]: I0225 12:07:31.916150 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8mnt" podStartSLOduration=2.46000786 podStartE2EDuration="4.916122806s" podCreationTimestamp="2026-02-25 12:07:27 +0000 UTC" firstStartedPulling="2026-02-25 12:07:28.840041287 +0000 UTC m=+4474.338623312" lastFinishedPulling="2026-02-25 12:07:31.296156233 +0000 UTC m=+4476.794738258" observedRunningTime="2026-02-25 12:07:31.902105834 +0000 UTC m=+4477.400687899" watchObservedRunningTime="2026-02-25 12:07:31.916122806 +0000 UTC m=+4477.414704841"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.245009 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ghxvx/must-gather-4jxd6"]
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.245268 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ghxvx/must-gather-4jxd6" podUID="b395cf46-cc41-434e-be61-6104918005b0" containerName="copy" containerID="cri-o://97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d" gracePeriod=2
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.261006 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ghxvx/must-gather-4jxd6"]
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.766266 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ghxvx_must-gather-4jxd6_b395cf46-cc41-434e-be61-6104918005b0/copy/0.log"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.766817 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.849000 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b395cf46-cc41-434e-be61-6104918005b0-must-gather-output\") pod \"b395cf46-cc41-434e-be61-6104918005b0\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") "
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.849079 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ll2x\" (UniqueName: \"kubernetes.io/projected/b395cf46-cc41-434e-be61-6104918005b0-kube-api-access-8ll2x\") pod \"b395cf46-cc41-434e-be61-6104918005b0\" (UID: \"b395cf46-cc41-434e-be61-6104918005b0\") "
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.855289 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b395cf46-cc41-434e-be61-6104918005b0-kube-api-access-8ll2x" (OuterVolumeSpecName: "kube-api-access-8ll2x") pod "b395cf46-cc41-434e-be61-6104918005b0" (UID: "b395cf46-cc41-434e-be61-6104918005b0"). InnerVolumeSpecName "kube-api-access-8ll2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.884709 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ghxvx_must-gather-4jxd6_b395cf46-cc41-434e-be61-6104918005b0/copy/0.log"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.885148 4725 generic.go:334] "Generic (PLEG): container finished" podID="b395cf46-cc41-434e-be61-6104918005b0" containerID="97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d" exitCode=143
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.885194 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ghxvx/must-gather-4jxd6"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.885248 4725 scope.go:117] "RemoveContainer" containerID="97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.908559 4725 scope.go:117] "RemoveContainer" containerID="3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.943121 4725 scope.go:117] "RemoveContainer" containerID="97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d"
Feb 25 12:07:32 crc kubenswrapper[4725]: E0225 12:07:32.944126 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d\": container with ID starting with 97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d not found: ID does not exist" containerID="97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.944171 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d"} err="failed to get container status \"97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d\": rpc error: code = NotFound desc = could not find container \"97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d\": container with ID starting with 97b7702dddd277a10f8da440e00741b97e518c1db6297c5d11d1440cfad2839d not found: ID does not exist"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.944201 4725 scope.go:117] "RemoveContainer" containerID="3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c"
Feb 25 12:07:32 crc kubenswrapper[4725]: E0225 12:07:32.944653 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c\": container with ID starting with 3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c not found: ID does not exist" containerID="3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.944696 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c"} err="failed to get container status \"3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c\": rpc error: code = NotFound desc = could not find container \"3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c\": container with ID starting with 3fcb99d0a319ffbc4f59c0f75d6b6d025dbe4bb278a0296943534d251751bd6c not found: ID does not exist"
Feb 25 12:07:32 crc kubenswrapper[4725]: I0225 12:07:32.950967 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ll2x\" (UniqueName: \"kubernetes.io/projected/b395cf46-cc41-434e-be61-6104918005b0-kube-api-access-8ll2x\") on node \"crc\" DevicePath \"\""
Feb 25 12:07:33 crc kubenswrapper[4725]: I0225 12:07:33.015864 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b395cf46-cc41-434e-be61-6104918005b0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b395cf46-cc41-434e-be61-6104918005b0" (UID: "b395cf46-cc41-434e-be61-6104918005b0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 12:07:33 crc kubenswrapper[4725]: I0225 12:07:33.052877 4725 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b395cf46-cc41-434e-be61-6104918005b0-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 25 12:07:33 crc kubenswrapper[4725]: I0225 12:07:33.283510 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b395cf46-cc41-434e-be61-6104918005b0" path="/var/lib/kubelet/pods/b395cf46-cc41-434e-be61-6104918005b0/volumes"
Feb 25 12:07:35 crc kubenswrapper[4725]: I0225 12:07:35.230621 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"
Feb 25 12:07:35 crc kubenswrapper[4725]: E0225 12:07:35.231646 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 12:07:37 crc kubenswrapper[4725]: I0225 12:07:37.750586 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:37 crc kubenswrapper[4725]: I0225 12:07:37.750949 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:37 crc kubenswrapper[4725]: I0225 12:07:37.801926 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:37 crc kubenswrapper[4725]: I0225 12:07:37.974270 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:41 crc kubenswrapper[4725]: I0225 12:07:41.105212 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8mnt"]
Feb 25 12:07:41 crc kubenswrapper[4725]: I0225 12:07:41.105991 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8mnt" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="registry-server" containerID="cri-o://637d6da238353c3e4ad6092991b48b24878ffe273d250b567c31769853156f39" gracePeriod=2
Feb 25 12:07:41 crc kubenswrapper[4725]: I0225 12:07:41.966214 4725 generic.go:334] "Generic (PLEG): container finished" podID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerID="637d6da238353c3e4ad6092991b48b24878ffe273d250b567c31769853156f39" exitCode=0
Feb 25 12:07:41 crc kubenswrapper[4725]: I0225 12:07:41.966267 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8mnt" event={"ID":"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb","Type":"ContainerDied","Data":"637d6da238353c3e4ad6092991b48b24878ffe273d250b567c31769853156f39"}
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.191813 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.344526 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-catalog-content\") pod \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") "
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.345106 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-utilities\") pod \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") "
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.345619 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b4l7\" (UniqueName: \"kubernetes.io/projected/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-kube-api-access-5b4l7\") pod \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\" (UID: \"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb\") "
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.346095 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-utilities" (OuterVolumeSpecName: "utilities") pod "557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" (UID: "557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.346745 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.350229 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-kube-api-access-5b4l7" (OuterVolumeSpecName: "kube-api-access-5b4l7") pod "557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" (UID: "557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb"). InnerVolumeSpecName "kube-api-access-5b4l7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.449014 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b4l7\" (UniqueName: \"kubernetes.io/projected/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-kube-api-access-5b4l7\") on node \"crc\" DevicePath \"\""
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.491761 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" (UID: "557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.550973 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.977340 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8mnt" event={"ID":"557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb","Type":"ContainerDied","Data":"c36c3d986a832e45089513d1d012ee93840c6a96d7e8f1ced06ea3d1b0a07689"}
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.977406 4725 scope.go:117] "RemoveContainer" containerID="637d6da238353c3e4ad6092991b48b24878ffe273d250b567c31769853156f39"
Feb 25 12:07:42 crc kubenswrapper[4725]: I0225 12:07:42.977406 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8mnt"
Feb 25 12:07:43 crc kubenswrapper[4725]: I0225 12:07:43.003544 4725 scope.go:117] "RemoveContainer" containerID="6d0d538210a0668b44bb451833c631655cebcbc847e5b2579871079bcf4e455f"
Feb 25 12:07:43 crc kubenswrapper[4725]: I0225 12:07:43.015158 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8mnt"]
Feb 25 12:07:43 crc kubenswrapper[4725]: I0225 12:07:43.022498 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8mnt"]
Feb 25 12:07:43 crc kubenswrapper[4725]: I0225 12:07:43.048884 4725 scope.go:117] "RemoveContainer" containerID="06218cfe0a1c707717c79ed43b7745e04ffc6824e7726cf21488d6dbf004cd76"
Feb 25 12:07:43 crc kubenswrapper[4725]: I0225 12:07:43.235272 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" path="/var/lib/kubelet/pods/557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb/volumes"
Feb 25 12:07:49 crc kubenswrapper[4725]: I0225 12:07:49.226479 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b"
Feb 25 12:07:49 crc kubenswrapper[4725]: E0225 12:07:49.230670 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.144487 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533688-c95v7"]
Feb 25 12:08:00 crc kubenswrapper[4725]: E0225 12:08:00.146748 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="registry-server"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.146880 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="registry-server"
Feb 25 12:08:00 crc kubenswrapper[4725]: E0225 12:08:00.146982 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b395cf46-cc41-434e-be61-6104918005b0" containerName="gather"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.147070 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b395cf46-cc41-434e-be61-6104918005b0" containerName="gather"
Feb 25 12:08:00 crc kubenswrapper[4725]: E0225 12:08:00.147169 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="extract-utilities"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.147260 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="extract-utilities"
Feb 25 12:08:00 crc kubenswrapper[4725]: E0225 12:08:00.147358 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b395cf46-cc41-434e-be61-6104918005b0" containerName="copy"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.147438 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b395cf46-cc41-434e-be61-6104918005b0" containerName="copy"
Feb 25 12:08:00 crc kubenswrapper[4725]: E0225 12:08:00.147531 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="extract-content"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.147617 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="extract-content"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.148199 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b395cf46-cc41-434e-be61-6104918005b0" containerName="copy"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.148334 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="557a8a3d-b7c3-4c4a-919d-db2c0b53a2cb" containerName="registry-server"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.148445 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b395cf46-cc41-434e-be61-6104918005b0" containerName="gather"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.149486 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533688-c95v7"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.151651 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.152020 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.152542 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.156871 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533688-c95v7"]
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.262110 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m4p5\" (UniqueName: \"kubernetes.io/projected/7f79d3d0-0eec-4585-979e-0c00e26815a4-kube-api-access-8m4p5\") pod \"auto-csr-approver-29533688-c95v7\" (UID: \"7f79d3d0-0eec-4585-979e-0c00e26815a4\") " pod="openshift-infra/auto-csr-approver-29533688-c95v7"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.364246 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m4p5\" (UniqueName: \"kubernetes.io/projected/7f79d3d0-0eec-4585-979e-0c00e26815a4-kube-api-access-8m4p5\") pod \"auto-csr-approver-29533688-c95v7\" (UID: \"7f79d3d0-0eec-4585-979e-0c00e26815a4\") " pod="openshift-infra/auto-csr-approver-29533688-c95v7"
Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.390705 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m4p5\" (UniqueName: \"kubernetes.io/projected/7f79d3d0-0eec-4585-979e-0c00e26815a4-kube-api-access-8m4p5\") pod \"auto-csr-approver-29533688-c95v7\" (UID: \"7f79d3d0-0eec-4585-979e-0c00e26815a4\") "
pod="openshift-infra/auto-csr-approver-29533688-c95v7" Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.469167 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533688-c95v7" Feb 25 12:08:00 crc kubenswrapper[4725]: W0225 12:08:00.924661 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f79d3d0_0eec_4585_979e_0c00e26815a4.slice/crio-706974eeaa8326cb6c510f7443eebade9c68afa6a44854321d880519fb4f9cdb WatchSource:0}: Error finding container 706974eeaa8326cb6c510f7443eebade9c68afa6a44854321d880519fb4f9cdb: Status 404 returned error can't find the container with id 706974eeaa8326cb6c510f7443eebade9c68afa6a44854321d880519fb4f9cdb Feb 25 12:08:00 crc kubenswrapper[4725]: I0225 12:08:00.929046 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533688-c95v7"] Feb 25 12:08:01 crc kubenswrapper[4725]: I0225 12:08:01.154165 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533688-c95v7" event={"ID":"7f79d3d0-0eec-4585-979e-0c00e26815a4","Type":"ContainerStarted","Data":"706974eeaa8326cb6c510f7443eebade9c68afa6a44854321d880519fb4f9cdb"} Feb 25 12:08:03 crc kubenswrapper[4725]: I0225 12:08:03.171800 4725 generic.go:334] "Generic (PLEG): container finished" podID="7f79d3d0-0eec-4585-979e-0c00e26815a4" containerID="e9fd9ae0b21a13867c77fa033aa93990a0a5a1e5597350932aacfc4b3d68d1de" exitCode=0 Feb 25 12:08:03 crc kubenswrapper[4725]: I0225 12:08:03.172931 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533688-c95v7" event={"ID":"7f79d3d0-0eec-4585-979e-0c00e26815a4","Type":"ContainerDied","Data":"e9fd9ae0b21a13867c77fa033aa93990a0a5a1e5597350932aacfc4b3d68d1de"} Feb 25 12:08:03 crc kubenswrapper[4725]: I0225 12:08:03.224614 4725 scope.go:117] "RemoveContainer" 
containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:08:03 crc kubenswrapper[4725]: E0225 12:08:03.225072 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:08:04 crc kubenswrapper[4725]: I0225 12:08:04.495661 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533688-c95v7" Feb 25 12:08:04 crc kubenswrapper[4725]: I0225 12:08:04.643897 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m4p5\" (UniqueName: \"kubernetes.io/projected/7f79d3d0-0eec-4585-979e-0c00e26815a4-kube-api-access-8m4p5\") pod \"7f79d3d0-0eec-4585-979e-0c00e26815a4\" (UID: \"7f79d3d0-0eec-4585-979e-0c00e26815a4\") " Feb 25 12:08:04 crc kubenswrapper[4725]: I0225 12:08:04.649626 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f79d3d0-0eec-4585-979e-0c00e26815a4-kube-api-access-8m4p5" (OuterVolumeSpecName: "kube-api-access-8m4p5") pod "7f79d3d0-0eec-4585-979e-0c00e26815a4" (UID: "7f79d3d0-0eec-4585-979e-0c00e26815a4"). InnerVolumeSpecName "kube-api-access-8m4p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:08:04 crc kubenswrapper[4725]: I0225 12:08:04.746670 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m4p5\" (UniqueName: \"kubernetes.io/projected/7f79d3d0-0eec-4585-979e-0c00e26815a4-kube-api-access-8m4p5\") on node \"crc\" DevicePath \"\"" Feb 25 12:08:05 crc kubenswrapper[4725]: I0225 12:08:05.189737 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533688-c95v7" event={"ID":"7f79d3d0-0eec-4585-979e-0c00e26815a4","Type":"ContainerDied","Data":"706974eeaa8326cb6c510f7443eebade9c68afa6a44854321d880519fb4f9cdb"} Feb 25 12:08:05 crc kubenswrapper[4725]: I0225 12:08:05.190053 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="706974eeaa8326cb6c510f7443eebade9c68afa6a44854321d880519fb4f9cdb" Feb 25 12:08:05 crc kubenswrapper[4725]: I0225 12:08:05.189790 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533688-c95v7" Feb 25 12:08:05 crc kubenswrapper[4725]: I0225 12:08:05.570349 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533682-nsxgv"] Feb 25 12:08:05 crc kubenswrapper[4725]: I0225 12:08:05.577800 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533682-nsxgv"] Feb 25 12:08:07 crc kubenswrapper[4725]: I0225 12:08:07.235967 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a075f866-c9c5-4c21-b642-570305f1cbd7" path="/var/lib/kubelet/pods/a075f866-c9c5-4c21-b642-570305f1cbd7/volumes" Feb 25 12:08:18 crc kubenswrapper[4725]: I0225 12:08:18.225370 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:08:18 crc kubenswrapper[4725]: E0225 12:08:18.226489 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:08:23 crc kubenswrapper[4725]: I0225 12:08:23.289592 4725 scope.go:117] "RemoveContainer" containerID="9ac2f59219692a3d9243cfcce0deb70d8b971e54af453ceeac445fa6fb8c6e52" Feb 25 12:08:23 crc kubenswrapper[4725]: I0225 12:08:23.330513 4725 scope.go:117] "RemoveContainer" containerID="9c86e260d8620a12faac240fb65889d862e71728642a45ab5c876038ae5e279b" Feb 25 12:08:23 crc kubenswrapper[4725]: I0225 12:08:23.369119 4725 scope.go:117] "RemoveContainer" containerID="ffec85d015b9db13f1855e940a79d65632be38c51e6adb5476af43c053aa1fcf" Feb 25 12:08:23 crc kubenswrapper[4725]: I0225 12:08:23.418350 4725 scope.go:117] "RemoveContainer" containerID="28a1c5a50baa6df28d366961a9e3b1720c250b4fb3cf493a8e24defaf397ba20" Feb 25 12:08:29 crc kubenswrapper[4725]: I0225 12:08:29.225750 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:08:29 crc kubenswrapper[4725]: E0225 12:08:29.226775 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:08:41 crc kubenswrapper[4725]: I0225 12:08:41.225404 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:08:41 crc kubenswrapper[4725]: E0225 12:08:41.226489 4725 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:08:53 crc kubenswrapper[4725]: I0225 12:08:53.225282 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:08:53 crc kubenswrapper[4725]: E0225 12:08:53.226290 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:09:05 crc kubenswrapper[4725]: I0225 12:09:05.232549 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:09:05 crc kubenswrapper[4725]: E0225 12:09:05.233440 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:09:20 crc kubenswrapper[4725]: I0225 12:09:20.224120 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:09:20 crc kubenswrapper[4725]: E0225 
12:09:20.224909 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:09:35 crc kubenswrapper[4725]: I0225 12:09:35.235144 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:09:35 crc kubenswrapper[4725]: E0225 12:09:35.236326 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:09:49 crc kubenswrapper[4725]: I0225 12:09:49.224811 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:09:49 crc kubenswrapper[4725]: E0225 12:09:49.226382 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.174157 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533690-b524w"] Feb 25 12:10:00 crc 
kubenswrapper[4725]: E0225 12:10:00.175645 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f79d3d0-0eec-4585-979e-0c00e26815a4" containerName="oc" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.175674 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f79d3d0-0eec-4585-979e-0c00e26815a4" containerName="oc" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.176106 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f79d3d0-0eec-4585-979e-0c00e26815a4" containerName="oc" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.177471 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533690-b524w" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.186269 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.187249 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.187450 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.192482 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533690-b524w"] Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.275046 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22qdn\" (UniqueName: \"kubernetes.io/projected/d90826c8-d6f3-43ed-9fdb-bb1454f157bd-kube-api-access-22qdn\") pod \"auto-csr-approver-29533690-b524w\" (UID: \"d90826c8-d6f3-43ed-9fdb-bb1454f157bd\") " pod="openshift-infra/auto-csr-approver-29533690-b524w" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.376774 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-22qdn\" (UniqueName: \"kubernetes.io/projected/d90826c8-d6f3-43ed-9fdb-bb1454f157bd-kube-api-access-22qdn\") pod \"auto-csr-approver-29533690-b524w\" (UID: \"d90826c8-d6f3-43ed-9fdb-bb1454f157bd\") " pod="openshift-infra/auto-csr-approver-29533690-b524w" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.407452 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22qdn\" (UniqueName: \"kubernetes.io/projected/d90826c8-d6f3-43ed-9fdb-bb1454f157bd-kube-api-access-22qdn\") pod \"auto-csr-approver-29533690-b524w\" (UID: \"d90826c8-d6f3-43ed-9fdb-bb1454f157bd\") " pod="openshift-infra/auto-csr-approver-29533690-b524w" Feb 25 12:10:00 crc kubenswrapper[4725]: I0225 12:10:00.513569 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533690-b524w" Feb 25 12:10:01 crc kubenswrapper[4725]: I0225 12:10:01.044301 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533690-b524w"] Feb 25 12:10:01 crc kubenswrapper[4725]: I0225 12:10:01.224409 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:10:01 crc kubenswrapper[4725]: E0225 12:10:01.225582 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:10:01 crc kubenswrapper[4725]: I0225 12:10:01.486925 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533690-b524w" 
event={"ID":"d90826c8-d6f3-43ed-9fdb-bb1454f157bd","Type":"ContainerStarted","Data":"664b2e919746f112944e7f50179f6c3b70b2a95802b833164f5eab81a0ad8a05"} Feb 25 12:10:03 crc kubenswrapper[4725]: I0225 12:10:03.509451 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533690-b524w" event={"ID":"d90826c8-d6f3-43ed-9fdb-bb1454f157bd","Type":"ContainerStarted","Data":"e7917ce41f09c9d6dba80438288d7c99d3b0069b9c5daaa47c722060d5ef1bec"} Feb 25 12:10:03 crc kubenswrapper[4725]: I0225 12:10:03.534579 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533690-b524w" podStartSLOduration=1.5855923490000001 podStartE2EDuration="3.534560084s" podCreationTimestamp="2026-02-25 12:10:00 +0000 UTC" firstStartedPulling="2026-02-25 12:10:01.053093856 +0000 UTC m=+4626.551675891" lastFinishedPulling="2026-02-25 12:10:03.002061581 +0000 UTC m=+4628.500643626" observedRunningTime="2026-02-25 12:10:03.527325832 +0000 UTC m=+4629.025907947" watchObservedRunningTime="2026-02-25 12:10:03.534560084 +0000 UTC m=+4629.033142109" Feb 25 12:10:04 crc kubenswrapper[4725]: I0225 12:10:04.520946 4725 generic.go:334] "Generic (PLEG): container finished" podID="d90826c8-d6f3-43ed-9fdb-bb1454f157bd" containerID="e7917ce41f09c9d6dba80438288d7c99d3b0069b9c5daaa47c722060d5ef1bec" exitCode=0 Feb 25 12:10:04 crc kubenswrapper[4725]: I0225 12:10:04.521022 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533690-b524w" event={"ID":"d90826c8-d6f3-43ed-9fdb-bb1454f157bd","Type":"ContainerDied","Data":"e7917ce41f09c9d6dba80438288d7c99d3b0069b9c5daaa47c722060d5ef1bec"} Feb 25 12:10:05 crc kubenswrapper[4725]: I0225 12:10:05.924345 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533690-b524w" Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.102636 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22qdn\" (UniqueName: \"kubernetes.io/projected/d90826c8-d6f3-43ed-9fdb-bb1454f157bd-kube-api-access-22qdn\") pod \"d90826c8-d6f3-43ed-9fdb-bb1454f157bd\" (UID: \"d90826c8-d6f3-43ed-9fdb-bb1454f157bd\") " Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.112754 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90826c8-d6f3-43ed-9fdb-bb1454f157bd-kube-api-access-22qdn" (OuterVolumeSpecName: "kube-api-access-22qdn") pod "d90826c8-d6f3-43ed-9fdb-bb1454f157bd" (UID: "d90826c8-d6f3-43ed-9fdb-bb1454f157bd"). InnerVolumeSpecName "kube-api-access-22qdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.205446 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22qdn\" (UniqueName: \"kubernetes.io/projected/d90826c8-d6f3-43ed-9fdb-bb1454f157bd-kube-api-access-22qdn\") on node \"crc\" DevicePath \"\"" Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.610108 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533690-b524w" event={"ID":"d90826c8-d6f3-43ed-9fdb-bb1454f157bd","Type":"ContainerDied","Data":"664b2e919746f112944e7f50179f6c3b70b2a95802b833164f5eab81a0ad8a05"} Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.610469 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664b2e919746f112944e7f50179f6c3b70b2a95802b833164f5eab81a0ad8a05" Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.610579 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533690-b524w" Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.637975 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533684-8k7pj"] Feb 25 12:10:06 crc kubenswrapper[4725]: I0225 12:10:06.647431 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533684-8k7pj"] Feb 25 12:10:07 crc kubenswrapper[4725]: I0225 12:10:07.242723 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdd6fbf-7184-4d92-a775-42d116b9491a" path="/var/lib/kubelet/pods/2cdd6fbf-7184-4d92-a775-42d116b9491a/volumes" Feb 25 12:10:13 crc kubenswrapper[4725]: I0225 12:10:13.226114 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:10:13 crc kubenswrapper[4725]: E0225 12:10:13.227091 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:10:23 crc kubenswrapper[4725]: I0225 12:10:23.544527 4725 scope.go:117] "RemoveContainer" containerID="07a354ad90a6a6a96002772adaed084e914c2e7826509075f6b4becd39b3f7c8" Feb 25 12:10:24 crc kubenswrapper[4725]: I0225 12:10:24.226572 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:10:24 crc kubenswrapper[4725]: E0225 12:10:24.226939 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:10:38 crc kubenswrapper[4725]: I0225 12:10:38.224894 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:10:38 crc kubenswrapper[4725]: E0225 12:10:38.226132 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:10:51 crc kubenswrapper[4725]: I0225 12:10:51.225455 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:10:51 crc kubenswrapper[4725]: E0225 12:10:51.226583 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:11:06 crc kubenswrapper[4725]: I0225 12:11:06.225083 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:11:06 crc kubenswrapper[4725]: E0225 12:11:06.227316 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:11:21 crc kubenswrapper[4725]: I0225 12:11:21.224778 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:11:21 crc kubenswrapper[4725]: E0225 12:11:21.225700 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:11:34 crc kubenswrapper[4725]: I0225 12:11:34.225609 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:11:34 crc kubenswrapper[4725]: E0225 12:11:34.226672 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-256sf_openshift-machine-config-operator(c4742f60-e555-4f96-be12-b9e46a857bd4)\"" pod="openshift-machine-config-operator/machine-config-daemon-256sf" podUID="c4742f60-e555-4f96-be12-b9e46a857bd4" Feb 25 12:11:45 crc kubenswrapper[4725]: I0225 12:11:45.241184 4725 scope.go:117] "RemoveContainer" containerID="b835936ab4c18ebfb53ec28f02f869f07a452335a11aeaeeb14955eac3653a4b" Feb 25 12:11:45 crc kubenswrapper[4725]: I0225 12:11:45.788776 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-256sf" 
event={"ID":"c4742f60-e555-4f96-be12-b9e46a857bd4","Type":"ContainerStarted","Data":"bb63f75c0f4809f70d62d3dcb657a367a30e2e79e5a7e8498f18ef4c1d4a59c0"} Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.173066 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533692-6v4zs"] Feb 25 12:12:00 crc kubenswrapper[4725]: E0225 12:12:00.174680 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90826c8-d6f3-43ed-9fdb-bb1454f157bd" containerName="oc" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.174712 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90826c8-d6f3-43ed-9fdb-bb1454f157bd" containerName="oc" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.175217 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90826c8-d6f3-43ed-9fdb-bb1454f157bd" containerName="oc" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.176520 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533692-6v4zs" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.178561 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mt7bb" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.181041 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.181040 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.185176 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533692-6v4zs"] Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.300968 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk6wj\" (UniqueName: 
\"kubernetes.io/projected/d745be80-4f6d-45af-a161-bde2cf4025e5-kube-api-access-bk6wj\") pod \"auto-csr-approver-29533692-6v4zs\" (UID: \"d745be80-4f6d-45af-a161-bde2cf4025e5\") " pod="openshift-infra/auto-csr-approver-29533692-6v4zs" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.402752 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk6wj\" (UniqueName: \"kubernetes.io/projected/d745be80-4f6d-45af-a161-bde2cf4025e5-kube-api-access-bk6wj\") pod \"auto-csr-approver-29533692-6v4zs\" (UID: \"d745be80-4f6d-45af-a161-bde2cf4025e5\") " pod="openshift-infra/auto-csr-approver-29533692-6v4zs" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.426794 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk6wj\" (UniqueName: \"kubernetes.io/projected/d745be80-4f6d-45af-a161-bde2cf4025e5-kube-api-access-bk6wj\") pod \"auto-csr-approver-29533692-6v4zs\" (UID: \"d745be80-4f6d-45af-a161-bde2cf4025e5\") " pod="openshift-infra/auto-csr-approver-29533692-6v4zs" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.504197 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533692-6v4zs" Feb 25 12:12:00 crc kubenswrapper[4725]: I0225 12:12:00.989323 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533692-6v4zs"] Feb 25 12:12:01 crc kubenswrapper[4725]: I0225 12:12:01.013283 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 12:12:01 crc kubenswrapper[4725]: I0225 12:12:01.945719 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533692-6v4zs" event={"ID":"d745be80-4f6d-45af-a161-bde2cf4025e5","Type":"ContainerStarted","Data":"4ceb36231b99c166d089fa134a92374060a00c6641569f0106be130ecf163f9a"}